Mention "killer robots" and what image is conjured up? If you thought of the apocalyptic scene at the beginning of Terminator 2 where laser-toting androids crush the skulls of vanquished adversaries while battling the rag-tag remnants of humanity, you wouldn’t be alone. Judgement Day may be confined to fiction, but advanced automated weapons already make decisions about which target to engage, albeit with a human in the loop.

Prof. Christof Heyns, the UN Special Rapporteur on extrajudicial, summary or arbitrary executions and professor of human rights law at the University of Pretoria, believes lethal autonomous robots (LARs) raise issues concerning the right to life. He recently submitted a report to the United Nations Human Rights Council recommending that member states issue moratoria on aspects of LAR use.

"Heyns believes if such systems targeted directly against humans, it could count as a summary execution."

"Because it’s a new technology there’s the opportunity to deal with this issue in advance, unlike drones where the genie’s already out of the bottle," Heyns says. "I think with drones we really got into this without thinking about it in advance, so maybe we can do it better with LARS."

Heyns accepts that there are no indications anyone is currently using fully autonomous robots with lethal capacity; in most cases, related technology is either not fully autonomous or not intended to be lethal.

A system such as the Aegis Combat System fitted with the Phalanx Close-In Weapon System, for example, will automatically identify and destroy a target, but it is designed to counter incoming munitions, not to be used against human beings.

Heyns believes that if such systems were targeted directly against humans, it could count as summary execution.

"At the moment I think it’s too early to say it should be an illegal weapon, or that it can never meet the requirements of international humanitarian law, but I think there’s a clear danger that it could play that role," he explains.

"Indeed the expression ‘summary execution’ is just another way of saying unlawful killing. Potentially, this could be a form of killing that is unlawful under international law."

Human in the loop

To differentiate between LARs and current weapons with advanced targeting systems, Heyns uses the definition shared by Human Rights Watch and the US Department of Defense (DoD): weapons systems that, once deployed, can select and engage a human target without further human intervention.

"Potentially, this could be a form of killing that is unlawful under international law."

"The classical difference being that with drones there’s a human in the loop who takes the targeting decision and assumes responsibility for that, versus with LARs where there is no human in the loop, just a computer on board," he explains.

One of the problems with defining what is legal when it comes to this novel type of weapon is that international humanitarian law (IHL), otherwise known as the law of armed conflict, was drawn up before such systems were even considered possible.

There is no explicit mention that a human must be behind the decision to launch a lethal strike, because the technology simply was not available when IHL was drafted, leaving open the question of how to interpret this body of law in a modern context.

"It doesn’t explicitly say a human should make the decision, though it does talk about a commander’s responsibilities," says Heyns. "But do you say that because it doesn’t say explicitly there should be a human in the loop that it’s not required, or should you say that because it’s assumed that there will be a human in the loop, it’s required? That’s exactly where we are at the moment, having to try and figure that out."

Preparing for the battlefield of the future

Given this omission, Heyns makes two specific recommendations in his report. The first is that the UN should convene a High Level Panel on LARs, consisting of experts from fields such as law, robotics, computer science, military operations, diplomacy, conflict management, ethics and philosophy, to publish a report on LARs within a year.

"I don’t think anybody’s got the whole picture," says Heyns. "As a lawyer, I certainly don’t understand all the technical aspects, and the other way round. There’s also the whole set of ethical issues that should be brought in, and simply the matter of what is militarily possible."

In the meantime, Heyns’ second recommendation is that states should declare and implement national moratoria on – in other words, refrain from – at least the testing, production, assembly, transfer, acquisition, deployment and use of LARs until an internationally agreed framework on their future has been established.

Heyns does not expect any particular voice of opposition, even from manufacturers working on related technologies.

"I haven’t encountered any specific interest group that says, ‘LARS at all costs and there should be no contemplation’," he says. "In fact, my impression from Geneva where all the different states are either directly or indirectly represented is that everybody says it’s an issue of concern."

Adoption by member states

"Heyns does not expect any particular voice of opposition, even from manufacturers working on related technologies."

Legislation against LARs has already garnered widespread support. The US adopted a directive in November 2012 committing to ensure appropriate levels of human supervision of such systems, and the parliaments of the UK and France have both debated Heyns’ report, going beyond a moratorium to say they simply would not deploy such systems.

"There are two caveats," says Heyns. "One is that they said they would not deploy it for the time being, they didn’t exclude the possibility of doing it later."

"But the other concern is that LARs will evolve the same way as drones," he adds. "When they were first deployed, it was very clearly stated that they would only be used for reconnaissance purposes, precisely because people said it would be too dreadful to drop bombs from them. Eventually, that’s just the way things go. So yes, it’s good that we’ve got these assurances from states, but there should be a clearer constraint built into law about it."

While legislation will stay the hand of states prepared to act within the law, as with all weapons legislation there is a risk that rogue states or non-state actors could get hold of LARs. The prospect of killer robots in the wrong hands represents a threat too terrible to contemplate.
