Soldier’s Trust Required for Autonomous Systems
Army systems using artificial intelligence will require battlefield security to prevent information from being altered or blocked, says the U.S. Army Research Laboratory director, who specializes in sensors and electronic devices.
Speaking at the Army Autonomy and Artificial Intelligence Symposium and Exposition in Detroit, Philip Perconti cautioned that data can be hacked, and signals and information in the field can be altered. If that happens, soldiers will lose trust in the systems and turn them off.
The conference, hosted by the Association of the U.S. Army’s Institute of Land Warfare, focused on the growing embrace of robots, autonomous systems, machine learning and artificial intelligence. The two-day event at the Cobo Center ended Nov. 29.
Artificial intelligence is shaping today’s military readiness, but there is huge room for expansion, said retired Lt. Gen. Edward C. Cardon, a former Army business transformation director. “There is much more to do but much we can do right now,” he said, agreeing on the need for soldiers to believe the data. “We have to trust it.”
“AI is not the future. It has been here for years; it is just not evenly distributed yet,” said Charlie Greenbacker, In-Q-Tel vice president of analytics. Most of the best work in artificial intelligence is being done in the private sector, out in the open, he said. “We need to have a national strategy to compete.”
Brig. Gen. Matthew P. Easley, the U.S. Army Futures Command’s artificial intelligence task force director, said the Army is seeking an army of engineers to help.