The US Department of Defense (DoD) has clarified its rules on the use of autonomous armed robots in battle, stating that humans will always have the final decision on deploying lethal action.

The DoD recently announced its plans to upgrade military aiming systems, using machine learning to create a gun platform that can choose its targets autonomously.

Under the Advanced Targeting and Lethality Automated System (Atlas) project, the US Army will incorporate autonomous aiming capabilities into ground combat vehicles to help US Army gunners reach a higher level of precision.

The DoD said, however, that it remains committed to its rules governing the use of armed robots, known as directive 3000.09, which requires that a human decides every use of lethal action, meaning human operators can veto actions proposed by the machine.

Some commentators have expressed concerns over the use of autonomous armed robots, not least Noel Sharkey, professor of AI and robotics at the University of Sheffield and spokesperson for the Campaign to Stop Killer Robots.

Sharkey told Army Technology: “The US DoD directive 3000.09 requires ‘appropriate levels of human judgement’ but the word ‘veto’ raises alarm bells. I would not call it meaningful human control if the robot weapon chooses the target and the only role of the human is to be able to ‘veto’ that decision.

“Meaningful human control requires the human to deliberate on potential targets to determine their legitimacy for every attack before firing on them.”

The US Army said on its Federal Business Opportunities website that “the Army has a desire to leverage recent advances in computer vision and artificial intelligence/machine learning to develop autonomous target acquisition technology, that will be integrated with fire control technology, aimed at providing ground combat vehicles with the capability to acquire, identify, and engage targets at least three times faster than the current manual process.

“The ATLAS will integrate advanced sensors, processing, and fire control capabilities into a weapon system to demonstrate these desired capabilities.”

For the project, the DoD is looking for commercial partners to help deliver the autonomous aiming system. It held an industry open day on 12 March to discuss the use of automation on the battlefield and explore how the Atlas system can be updated in accordance with existing rules.

The US Army added: “The goal of this industry day is to provide developments achieved regarding these technologies within the traditional defence community, as well as the private sector, including those firms and academic institutions outside that do not traditionally do work with the US Army.”

–Additional reporting by Robert Scammell.