On 23 May 2018, the EUobserver website reported that research projects developing the next generation of weapons can now apply for EU funding as part of the EU’s €500 million defence investment programme. Controversially, research projects whose activities could (among other things) result in the development of Lethal Autonomous Robots (LARS) – sometimes called ‘Killer Robots’ (mostly by their critics) – are eligible to apply under this scheme. The EU Parliament unsuccessfully tabled an amendment to prevent such (and related) research programmes from accessing EU funds. In the end, the Parliament, the EU Commission, and the Council of the EU agreed that weapons research programmes should only be ineligible for funding if they develop weapons that contravene international law.
To be sure, from the perspective of international law, this requirement is sound. Article 36 of the 1977 Additional Protocol I to the Geneva Conventions obliges states to review any new weapon they develop to ensure that its use would not contravene International Humanitarian Law (IHL). In this sense, the EU’s approach to the funding of weapons research upholds international legal standards. Still, not everyone is happy. In an emailed statement to EUobserver, the German Green MEP (and ex-co-chair of Germany’s powerful Green Party), Reinhard ‘Bueti’ Buetikofer, criticised the decision to potentially make funds available for the development of, in his view, ‘inhumane weapons’, including LARS.
Does ‘Bueti’ have a point insofar as LARS are concerned? Perhaps. After all, since 2012, there has been a vocal and visible civil society campaign, known as the Campaign to Stop Killer Robots, which argues for an international legal ban on LARS. It is in no small part due to this campaign that the prospect of LARS has been discussed in a number of UN meetings within the framework of the Convention on Certain Conventional Weapons, though no decision on a ban has yet been made.
Falling short of a ban, in 2013 the UN Special Rapporteur on Extrajudicial, Summary and Arbitrary Executions, the South African law professor Christof Heyns, called for a moratorium on the development of LARS. According to Heyns, it might be better to pause such development until the relevant legal and ethical issues have been clarified. Those involved in drafting the EU’s defence investment programme do not seem to agree with the need for a moratorium; otherwise, research projects aimed at developing LARS would have remained ineligible for funding.
Indeed, from a legal perspective, it is questionable to what extent, if any, LARS differ from other currently legal weapons systems. The first thing to realise is that the development of robots for, and the deployment of robots by, the military is not illegal. For instance, militaries across the world rely on robots to dismantle mines and Improvised Explosive Devices. It is hard to see what would be wrong with this, either legally or morally. But surely, one might object, robotic weapons are a different kettle of fish, because they will be programmed to bring about a destructive and/or disabling kinetic effect in their operating environment.
Yet the devil is in the detail, for two reasons. First, not every destructive or disabling kinetic effect is necessarily lethal. Consider a missile defence system programmed to shoot down incoming missiles. Alternatively, consider an autonomous aerial vehicle (a ‘drone’) that, once programmed, searches for and destroys (unmanned) enemy drones. The kinetic effects of these two systems are not intended to be lethal. Would such weapons be deemed problematic by critics of LARS? If not, this would leave a lot of room for states to develop non-lethal autonomous robotic weapons. In this respect, the EU Parliament’s amendment would have made very little difference to the trajectory of research on robotic weapons.
Second, many weapons systems – robotic or otherwise – may not be wholly autonomous but have some autonomous elements. For instance, contemporary remote-controlled aerial vehicles have complex sensor suites that collect vast amounts of data. Data collection is typically automated because the sheer volume of data would overwhelm any single human operator. In future combat situations, it might be possible to automate defensive functions within a weapons system so that the human operator can concentrate on carrying out a particular mission. In the world of engineering, machine autonomy is rarely an either/or issue; a schematic sketch of such partial autonomy appears below. Again, it is unlikely that the EU Parliament’s amendment would have ruled out the development of such systems.
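To make the point concrete, here is a minimal, purely illustrative sketch of partial autonomy: sensing and threat filtering are automated, but any engagement is gated behind explicit human authorisation. All names, labels, and rules in it are invented for illustration and describe no real system.

```python
from dataclasses import dataclass


@dataclass
class Track:
    track_id: int
    kind: str        # e.g. "missile", "drone", "aircraft", "unknown"
    is_manned: bool  # whether the platform is believed to carry people


def classify_sensor_return(raw: dict) -> Track:
    """Automated step: turn a raw sensor return into a labelled track.

    A real system would fuse radar/optical data; here we simply read
    pre-labelled fields to keep the sketch self-contained.
    """
    return Track(
        track_id=raw["id"],
        kind=raw.get("kind", "unknown"),
        is_manned=raw.get("is_manned", True),  # assume manned when unsure
    )


def defensive_candidate(track: Track) -> bool:
    """Automated step: flag only unmanned incoming threats.

    This encodes the earlier point that a system can be limited to
    non-lethal kinetic effects (intercepting missiles or unmanned drones).
    """
    return track.kind in ("missile", "drone") and not track.is_manned


def request_engagement(track: Track, operator_approval: bool) -> str:
    """Human-in-the-loop step: no engagement without explicit approval."""
    if not defensive_candidate(track):
        return f"track {track.track_id}: outside autonomous engagement rules"
    if not operator_approval:
        return f"track {track.track_id}: held, awaiting operator authorisation"
    return f"track {track.track_id}: engagement authorised by operator"


if __name__ == "__main__":
    returns = [
        {"id": 1, "kind": "missile", "is_manned": False},
        {"id": 2, "kind": "aircraft", "is_manned": True},
    ]
    for raw in returns:
        track = classify_sensor_return(raw)
        # Only the operator's input can turn a candidate into a target.
        print(request_engagement(track, operator_approval=False))
```

The design point is simply that automation can occupy some steps of the chain while humans retain others – which is why a blanket amendment against ‘autonomous’ weapons would have been hard to apply to systems built this way.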
What, then, about (intentionally) lethal autonomous robotic weapons? Here, the crucial (legal) question is whether LARS would violate the legal requirement of distinction. In legal terms, distinction means that, in their targeting decisions, belligerents are obliged to distinguish between legitimate and illegitimate military targets. Only legitimate targets may be intentionally engaged. Arguably, the prime example in this respect is the distinction between combatants (legitimate targets) and non-combatants/innocent civilians (illegitimate targets). Only combatants may be intentionally targeted during an armed conflict. (The restriction on the intentional targeting of non-combatants is lifted for such time as civilians directly participate in hostilities.)
Now, contemporary robotic sensor suites can easily distinguish a human being from other entities in the robot’s operating environment (cars, houses, animals, you name it). Where robots will struggle, however, is in making the qualitative judgement as to whether a particular human individual has the status of a combatant or a non-combatant, or at least shows hostile intent. Sometimes – if one thinks of urban warfare scenarios or Israel’s recent clashes with Palestinian protesters at the Gaza border – this is difficult even for human soldiers. It is hard to see why machines would be better than humans in this regard. The toy sketch below illustrates the gap.
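The gap can be made vivid with a toy sketch. A classifier can be trained to say what an object is; but ‘combatant’ is a legal status, not a visual category, so it cannot simply be another output label. Everything below – labels, scores, function names – is invented for illustration and stands in for no real perception system.

```python
# Toy illustration only: labels, scores, and names are invented.

ENTITY_CLASSES = ("person", "vehicle", "animal", "building")


def detect_entity(scores: dict) -> str:
    """Automated step: pick the most likely entity class.

    Stands in for an object detector; telling a human apart from a car
    or a house is, as argued above, the comparatively easy part.
    """
    return max(ENTITY_CLASSES, key=lambda label: scores.get(label, 0.0))


def combatant_status(entity: str) -> str:
    """The hard part: combatant status is a qualitative, contextual
    judgement about conduct and intent, not a visual category, so it
    cannot be read off the detector's label.
    """
    if entity != "person":
        return "not applicable"
    return "unknown: requires contextual judgement (uniform, conduct, intent)"


if __name__ == "__main__":
    detection = detect_entity({"person": 0.92, "vehicle": 0.05})
    print(detection)                    # -> person
    print(combatant_status(detection))  # -> unknown: requires ...
```

The point is not that better classifiers are unattainable, but that combatant status depends on context and conduct that no entity label encodes.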
Here, the EU’s requirement that eligible weapons research programmes must ensure their creations’ compliance with international law (Article 36) becomes important. For the foreseeable future, it is highly unlikely that robots explicitly designed to engage human combatants could comply with the requirement of distinction. Unless LARS-focused research programmes applying for EU funds can show how they would deal with the issue of distinction, they are, as things stand, likely to be ineligible for EU funding, notwithstanding the rejection of the EU Parliament’s amendment. However, as was pointed out above, this would only apply to LARS, not to other robotic weapons systems.
Leaving the legal issues aside, is Mr. Buetikofer right in arguing that LARS (or robotic weapons more generally) are inhumane weapons? This is a difficult question. Consider the currently legal method of high-altitude bombing. Suppose that Gary, a pilot, flies into enemy territory at very high altitude so his aircraft is out of reach of anti-aircraft missiles. Once his aircraft is over the target area (say, military barracks in the enemy state), the onboard computer signals Gary to release his payload. Once the payload is released, Gary swiftly steers his plane out of enemy territory. By the time Gary’s payload reaches its target, the plane is already close to the border.
What would be so inhumane, one wonders, about automating the whole process by programming an autonomous drone to carry out the airstrike? After all, Gary never sees the target area and relies on his computer. As his plane speeds away, he is never directly confronted with the effects of his actions. Is it really sound to argue that using the autonomous drone to destroy the barracks is wrong because machines should not make decisions about life and death? In the end, Gary also relies on a machine – his targeting system – to make a decision about the release of his payload. Put differently, high-altitude bombing may or may not be inhumane, regardless of whether it is carried out by a human pilot or by an autonomous robotic aircraft. Perhaps the debate on LARS should not merely prompt us to think about the future of armed conflict, but also about whether its present means and methods are just.
In any case, the EU’s decision regarding the funding of weapons research shows that robotic weapons remain an important consideration for many militaries. No one wants to be left behind. While a legal ban on LARS may be unlikely, not least because IHL already seems to have the resources to deal with these systems, there is clearly a need for effective robotic arms control.