Combat unmanned aerial vehicle (UAV), or drone, technology enables aircraft to be flown remotely and to launch missile or bomb attacks on targets identified through surveillance and intelligence.

The ability to kill individuals remotely, with controllers sometimes thousands of miles away on another continent, has led to concerns about the ethics of drone use. The introduction of artificial intelligence (AI) has further complicated the question of ultimate responsibility for a strike.

The controversial history of armed drones

Concerns about using advanced weapons have been around since the earliest days of armed conflict – in April 1139 Pope Innocent II is said to have banned the use of the crossbow against Christians.

Precursors of armed unmanned aircraft date back to balloons loaded with explosives during the American Civil War. Recognisably modern versions emerged during the Cold War and saw service in Vietnam.

Armed drones have come into their own in recent conflicts in the Middle East, where targets can be spread over a wide area in inhospitable, inaccessible territory. The US military has deployed armed General Atomics Predator drones widely in Afghanistan.

Recent UAV strikes have also pushed the discussion of these ethical issues onto the international stage.

Human Rights Watch has investigated civilian casualties from Israeli strikes in the Gaza Strip, and CIA strikes against Al Qaeda targets in Pakistan and Yemen have proved controversial because they do not take place in a conventional war zone.

Statistics on civilian casualties are difficult to obtain.

The US Department of Defense (DOD) does not compile statistics on the total number of civilians killed by its unmanned aircraft, and estimates of civilian casualties do not distinguish between deaths caused by remote-controlled drones and those caused by other aircraft.

A Human Rights Watch report titled ‘Precisely Wrong’, released in June 2009, details six incidents resulting in 29 civilian deaths, among them eight children.

Laws of war

International laws of war such as the Geneva Conventions govern the conduct of participants in armed conflict and require them to limit collateral damage through proper identification of targets and distinction between combatants and non-combatants.

But rapidly advancing drone technology, especially the use of AI to identify targets and suggest action against them, makes it harder to assign accountability to an individual. Current rules of engagement require human control over the decision to strike – a ‘man in the loop’.

Blay Whitby, a lecturer in Computer Science and Artificial Intelligence at the University of Sussex and a philosopher concerned with technology ethics, worries that although AI works well in a simplified factory environment, a battlefield environment is too complex for the technology.

“The AI that the military robots are using is pretty primitive,” he says. “If you’ve got an AI device that has very limited intelligence, you can simplify its environment in a factory, but on the battlefield you’re putting an essentially limited technology into a very complex and fast-moving environment. Will things go wrong? Sure they’ll go wrong; it’s just a question of what.”

However, despite the growing scope for automation, in some ways armed drones enable humans to work as a team to better verify a target before acting on it.

Elizabeth Quintana, senior research fellow with the Air Power and Technology group at the Royal United Services Institute (RUSI) think tank, argues: “UAVs are more accurate and better monitored because they can now stay up in the air for up to 24 hours, operated by crews working in shifts of eight hours at a time. Pilots are deployed to theatre for two or three months at a time, whereas the crews operating the Reapers and Predators, who are based in California, are deployed for three years at a time.”

Standard crews consist of a pilot and a weapons operator, and now increasingly include an image analyst trained in the cultural norms of the region under surveillance. One incident that highlighted the need for this expertise came when farmers in Afghanistan were mistakenly targeted by an air strike: during harvest season it was their practice to dig in culverts by the side of the road in the early hours to irrigate their fields, and their behaviour was mistaken for the planting of improvised explosive devices (IEDs).

However, remote UAV operating crews have reportedly complained of combat stress, despite being far from the battlefield. This could be down to any number of factors, including the increased responsibility for getting a target right, the inequitable nature of a conflict in which the pilots are not personally at risk, or the lengthy deployments.

Man-in-the-loop

The conventions of war dictate that a human must be ultimately responsible for the decision to deploy lethal force, dubbed ‘man-in-the-loop’ in robotics terms. However, advancing technology may see the military shift more decision-making to the software – a model dubbed ‘man-on-the-loop’.

“In this scenario, the decision to strike would be automated; the human would just be monitoring the process and have the ability to stop the deployment of lethal force,” says Whitby. “I would assume there would be a high level command to go to automated mode so perhaps the responsibility would be shifted higher up the command chain.”
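
To make the distinction concrete, here is a minimal sketch of the two control modes as a simple decision gate. It is a hypothetical illustration only – the names, structure and flags are invented for the example and do not describe any fielded system:

from enum import Enum

class ControlMode(Enum):
    MAN_IN_THE_LOOP = "human must authorise each strike"
    MAN_ON_THE_LOOP = "human may only veto an automated strike"

def strike_proceeds(mode, human_authorised=False, human_vetoed=False):
    """Toy gate showing where responsibility sits in each mode."""
    if mode is ControlMode.MAN_IN_THE_LOOP:
        # Default outcome is inaction: nothing happens unless a named
        # human positively approves, so accountability rests with them.
        return human_authorised
    if mode is ControlMode.MAN_ON_THE_LOOP:
        # Default outcome is action: the software proceeds unless the
        # monitoring human intervenes, so accountability shifts to
        # whoever switched the system into automated mode.
        return not human_vetoed
    return False

The crucial difference is the default: in the second branch, stopping the strike requires an explicit veto, which is why Whitby expects responsibility to migrate up the command chain to whoever authorised automated mode.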

Nevertheless, intelligence-gathering and decision-making technology could ensure a target is identified far more accurately than by a pilot who is flying an aircraft at the same time as carrying out an attack.

“The armed drone offers a number of different ways to obtain imagery to analyse, supplemented with signals intelligence such as intercepted data and mobile phone calls,” says Quintana. “You don’t use just one source of information to target somebody; you also have that corroborated by two or three different sources.”
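
The corroboration Quintana describes amounts to a simple rule: a candidate target is only forwarded for human review once enough independent sources agree. The sketch below is a hypothetical illustration of that rule – the source names and the threshold of three are invented for the example:

# Hypothetical multi-source corroboration check: a candidate target is
# only passed to a human analyst once enough independent intelligence
# disciplines point to the same conclusion.
REQUIRED_SOURCES = 3  # invented threshold, per Quintana's "two or three"

def corroborated(reports):
    """reports maps a source name, e.g. 'imagery' or 'sigint', to a bool."""
    confirmations = sum(1 for confirmed in reports.values() if confirmed)
    return confirmations >= REQUIRED_SOURCES

# A single confirming source is never enough on its own:
assert not corroborated({"imagery": True})
assert corroborated({"imagery": True, "sigint": True, "phone_intercepts": True})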

Asymmetric warfare

A further criticism of the use of armed drones in recent conflicts is that deploying technology so far in advance of anything available to opposing forces makes the engagement fundamentally unfair – a discrepancy the military calls ‘asymmetric warfare’.

“It’s a perception issue, as you could say the same sort of thing about using a Tornado,” says Quintana. “In Afghanistan the IEDs they’re using are extremely simple but have proven a serious threat to NATO forces. Public perception is why you have seen restraint in using armed drones in Libya.”

Whitby argues that the very use of superior technology can provoke escalation. “If you go into a battlefield in a very high-tech way, as the US and NATO have tended to do for the last ten years, you give the enemy little choice but to go to different types of warfare,” he says. “They’re either going to be intimidated by your technology, or they’re going to engage in an arms race and find ways to get round the technology using asymmetric warfare.”

Room for improvement

Armed drones have become a well-established part of a modern military arsenal, but questions remain unanswered about the ethics of their use. A number of options are available to address the key issues and ensure the rules of engagement remain relevant.

“Guidelines are lagging terribly behind keeping up with advances in technology,” says Whitby. “That’s not because people are lazy, it’s because things have moved very fast in this area and technology is notoriously hard to predict.”

He adds: “I’d like the military to be talking to the public more. I can’t help suspecting that a lot of this technology is profitable but not effective, and the defence industry is foisting it on a less than enthusiastic military. Senior ranks don’t want it because it confuses the chain of command and responsibility. Lower ranks are worried about risking their lives to retrieve drones that crash behind enemy lines.”

Quintana agrees that there should be more consultation. “There needs to be much more of a public discussion about the role of automation, but there should at least be an open debate rather than scaremongering,” she says.

Ultimately, the issue for the military is being able to attribute any decision to use lethal force to an individual – the same requirement whether a drone uses AI to select a target or someone on the ground issues coordinates to a piloted aircraft. The use of drones will continue to abide by the rules of engagement so long as those rules remain flexible enough to incorporate and regulate each new technology they feature.