Essay Warns of AI Limits on Land Combat Systems

Artificial intelligence has broad military applications, but its effectiveness in land combat systems could be limited in the opening stages of the next war, warns a new essay from the Association of the U.S. Army’s Institute of Land Warfare.

Written by Lt. Col. Stephan Pikner as part of an Army Strategist Association essay contest, the paper warns of “large blind spots” during opening stages of war because of how AI-enabled systems learn.

The problem is that artificial intelligence needs large amounts of data to make intelligent decisions, and that information is not necessarily gathered in combat training.

“Training an artificially intelligent system in a controlled environment such as the [National Training Center in California], where the machine’s classifications of opposing force targets can be updated with surety, may result in overly strong prior beliefs of enemy characteristics,” Pikner writes. “The more ambiguous evidence about the enemy’s signatures and locations in real-life combat may struggle to overturn these strongly formed prior beliefs. Systemic cases of false negatives, in which both the human trainer and the AI classification system fail to correctly identify a real-life threat, may create large blind spots in the ability of Army systems to find adversary forces.”
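The dynamic Pikner describes can be illustrated with a simple Bayesian updating sketch. All numbers below are hypothetical and chosen only to show the mechanism: a prior formed from many confident training-center observations barely moves when ambiguous field data arrives, while a weaker prior updates readily on the same evidence.

```python
# Sketch: how an overly strong prior resists ambiguous field evidence.
# Beta-Bernoulli model of "probability this signature is a real threat".
# All counts are illustrative, not drawn from the essay.

def posterior_mean(prior_hits, prior_misses, new_hits, new_misses):
    """Posterior mean of a Beta-Bernoulli model after new observations."""
    a = prior_hits + new_hits
    b = prior_misses + new_misses
    return a / (a + b)

# Prior formed in a controlled environment: 990 confirmed detections,
# 10 misses -- the system "believes" threats almost always look this way.
strong = posterior_mean(990, 10, new_hits=5, new_misses=45)

# Weaker prior from limited but varied data, updated on the same
# ambiguous combat observations (5 hits, 45 misses).
weak = posterior_mean(9, 1, new_hits=5, new_misses=45)

print(f"strong prior after ambiguous combat data: {strong:.2f}")  # ~0.95
print(f"weak prior after the same data:           {weak:.2f}")    # ~0.23
```

The strongly trained model still assigns roughly a 95 percent belief to its learned enemy profile despite 50 contradictory field observations, which is the kind of slow-to-overturn prior, and resulting blind spot, the essay warns about.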

The essay, “Training the Machines: Incorporating AI into Land Combat Systems,” won the Army Strategist Association contest. Pikner is an Army strategist studying at Georgetown University as part of the Advanced Strategic Planning and Policy Program.

“This is not a call to abandon research into military applications of AI,” Pikner writes. “Instead, Army leaders must recognize the limitations of machine learning in particular contexts, especially in situations in which adaptation and the recognition of causal links are critically important.”

The full paper is available here: