US Army clarifies rules on autonomous armed robots

Talal Husseini 13 March 2019 (Last Updated March 13th, 2019 11:09)

The US Department of Defense (DoD) has clarified its rules on the use of autonomous armed robots in battle, stating that humans will always have the final decision on deploying lethal action.

The US Department of Defense has confirmed that a human will always be in charge of lethal decisions made by autonomous weapons systems, under directive 3000.09. Credit: US Army/CDCC.

The DoD recently announced its plans to upgrade military aiming systems, using machine learning to create a gun platform that can choose its targets autonomously.

Under the Advanced Targeting and Lethality Automated System (Atlas) project, the US Army will incorporate autonomous aiming capabilities into ground combat vehicles to help US Army gunners reach a higher level of precision.

The DoD said, however, that it remains committed to the rules governing the use of armed robots, known as directive 3000.09, which require that a human decide every use of lethal force, meaning that human operators can veto actions proposed by the machine.

Some commentators have expressed their concerns over the use of autonomous armed robots, not least the Campaign to Stop Killer Robots spokesperson and University of Sheffield professor of AI and robotics Noel Sharkey.

Sharkey told Army Technology: “The US DoD directive 3000.09 requires ‘appropriate levels of human judgement’ but the word ‘veto’ raises alarm bells. I would not call it meaningful human control if the robot weapon chooses the target and the only role of the human is to be able to ‘veto’ that decision.

“Meaningful human control requires the human to deliberate on potential targets to determine their legitimacy for every attack before firing on them.”

The US Army said on its federal business opportunities website that “the Army has a desire to leverage recent advances in computer vision and artificial intelligence/machine learning to develop autonomous target acquisition technology, that will be integrated with fire control technology, aimed at providing ground combat vehicles with the capability to acquire, identify, and engage targets at least three times faster than the current manual process.

“The ATLAS will integrate advanced sensors, processing, and fire control capabilities into a weapon system to demonstrate these desired capabilities.”

For the project, the DoD is looking for commercial partners to help deliver the autonomous aiming system. It held an industry open day on 12 March to discuss the use of automation on the battlefield and explore how the Atlas system can be updated in accordance with existing rules.

The US Army added: “The goal of this industry day is to provide developments achieved regarding these technologies within the traditional defence community, as well as the private sector, including those firms and academic institutions outside that do not traditionally do work with the US Army.”

–Additional reporting by Robert Scammell.