Are we moving towards the prohibition of "killer robots"?

United Nations decides to set up expert commission on Lethal Autonomous Weapons Systems – the hardest work is yet to come

On 16 December 2016 the fifth Review Conference of the UN Convention on Certain Conventional Weapons (CCW) passed a resolution to establish a Group of Governmental Experts (GGE) in the following year to deal with Lethal Autonomous Weapons Systems (LAWS). The expert commission will work under Indian chairmanship and meet twice a year for five days.

Non-governmental organizations observing the Review Conference unanimously welcomed the decision. More than 60 NGOs, among them Human Rights Watch and Amnesty International, have been calling for a prohibition of Lethal Autonomous Weapons Systems for years and have joined forces in the Campaign to Stop Killer Robots. A ban on so-called killer robots has also been demanded by academics and IT specialists. The CCW's decision thus marks an important milestone, yet the hardest work still lies ahead. But first of all: what is this actually about?

According to the US Pentagon's definition, autonomous weapons systems are able to select and engage targets on their own, without any human intervention. At best, humans may still be able to abort a potential attack, but scenarios are conceivable in which humans can no longer exercise any influence at all. If such an attack is directed at other humans with the aim of killing, we speak of Lethal Autonomous Weapons Systems – so-called LAWS. It is here in particular that critics point to serious legal, moral and security-related problems.

NGOs therefore insist that future weapons systems be subject to "meaningful human control" and demand a prohibition of the development, production and use of weapons systems that lack it. The newly established group of governmental experts could propose such a prohibition, on which the convention's member states would then have to vote.

The decision to set up the expert group was by no means certain – Russia argued repeatedly that it is too early for a possible restriction or even prohibition of Lethal Autonomous Weapons Systems because technological developments cannot yet be foreseen. Despite these concerns, Russia ultimately declared that it would not stand in the way of establishing the GGE.

The GGE's establishment is particularly noteworthy since autonomous weapons systems have appeared on the international agenda only recently: NGOs have been dealing with the issue since 2009, and in 2013 Christof Heyns, UN Special Rapporteur on extrajudicial, summary or arbitrary executions, published a report calling for a moratorium on such systems. In 2014 the first informal meeting of experts took place in Geneva under French chairmanship; two more followed in 2015 and 2016 under German chairmanship. The purpose of these meetings was to shed light on the issue and to allow NGOs and experts to present their criticism, while state representatives could also express their views. Since virtually no state argued unreservedly in favor of Lethal Autonomous Weapons Systems and only a few explicitly opposed them, the majority of states were interested in keeping the debate going – an interest also reflected in the Review Conference's decision of 16 December 2016.

The most challenging part of the negotiations is now beginning for the CCW. At the first expert meeting in 2017, detailed questions will have to be discussed that have so far been addressed only marginally. Despite the Pentagon definition cited above, it has not yet been fully clarified what exactly counts as a (lethal) autonomous weapons system. The present definition is too broad and too abstract to be applied to specific weapons systems and thus leaves considerable room for clarification and refinement.

Furthermore, some states, including the US, have indicated that they find the concept of "meaningful human control" vague and intangible. Machines and computer algorithms already support troops to a great extent in many military contexts, and the step towards ceding the final decision on the use of weapons to computers is becoming ever smaller. How much room for maneuver is actually left to humans is debatable, even when they are formally involved in the chain of command.

A central question that has so far been discussed only marginally is how a possible regulation or even prohibition could be verified. In contrast to classical arms control, where sheer quantities are relatively easy to monitor, the "degree of autonomy" of a weapons system manifests itself in the software code, not in the physical hardware. Finding solutions for such qualitative arms control that all parties concerned can trust will certainly pose challenges to experts in the future, although initial deliberations already exist. This difficulty, however, is not confined to autonomous weapons systems. In almost all military hardware, it is now the software that determines the system's performance; in the cyber realm, software alone defines capabilities and vulnerabilities. Arms control will therefore have to grapple with this fundamental question to a considerable extent.

The most difficult obstacle is obtaining the consent of technologically advanced states that have so far positioned themselves neutrally towards restrictions on or a prohibition of LAWS. To date only 19 states have unambiguously declared themselves against lethal autonomous weapons systems and in favor of a prohibition, and this group does not include a single state that ranks internationally among the innovators of military technology. In other words, the states that have invested the most in advanced weapons systems have yet to be convinced to join a prohibition. Admittedly, no technologically advanced state has argued in favor of autonomous weapons systems either, and China even spoke in support of a "legally binding protocol related to the use of LAWS" at the recent Review Conference. Nevertheless, these states' support for a restriction or prohibition will depend heavily on solving the aforementioned verification problem.

A great deal of work therefore remains to be done in 2017 by the scientific community, NGOs and the delegated experts. Activists now need to move beyond campaign mode and take the security concerns of more skeptical states seriously. Previous negotiations of arms control agreements have shown that the devil is in the detail and that single words can sometimes make the difference. It will be a long and arduous process; fortunately, the GGE's mandate is open-ended and free of unnecessary time pressure.

Even if the GGE's success is anything but guaranteed and many obstacles remain, the overall process is most welcome from an arms control perspective and may even serve as motivation in other gridlocked areas. Although many experts see arms control as such in crisis, the CCW's latest decision, as well as the emerging debate on a prohibition of nuclear weapons, shows that the majority of states still share a common interest in regulating and prohibiting certain weapons.

by Niklas Schörnig
