December 2018, Nos. 3 & 4 Vol. LV, "New Technologies: Where To?"

History is witness that technology creates surprises on the battlefield. However, technology can blindside not only warriors but policymakers as well. The challenge of keeping pace with technology becomes even more formidable when policy is co-constructed in multilateral forums, which by their very nature require the patient build-up of common ground and consensus.

In recent years, technological fields have begun to merge and create new use scenarios in both the civil and military domains. A case in point is the formerly esoteric field of artificial intelligence (AI). Increased availability of computing power, the massive data generated by Internet-connected devices, and falling costs for data storage and manipulation have brought AI out of obscure conferences into newspaper headlines and the speeches of leaders. Techniques such as machine learning, combined with the availability of large datasets and the computing power to ‘train’ AI algorithms, have led to machines taking on tasks once reserved for the human brain. The defeat of the 18-time world Go champion Lee Sedol by DeepMind’s AlphaGo program in March 2016 is a powerful symbol of this shift in the balance of power between man and machine. It was only natural that these advances in the intelligent autonomy of digital systems would attract the attention of Governments, scientists and civil society concerned about the possible deployment and use of lethal autonomous weapons. What was needed was a forum to discuss these concerns and begin to construct common understandings regarding possible solutions.

Conventional weapons-related arms control tended to play second fiddle to strategic weaponry during the cold war. This imbalance persisted, even though technology and security trends began to shift in the late 1990s. The multilateral ecosystem for dealing with advanced conventional weaponry outside of ad hoc export control regimes—such as the 1996 Wassenaar Arrangement—remained relatively underdeveloped. Fortunately, one instrument at the juncture of international humanitarian law (IHL) and arms control stands out. This is the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (CCW).1

The Convention, negotiated under United Nations auspices in 1979–1980, has its roots in key IHL principles, such as proportionality and distinction between civilians and combatants. Currently, the Convention has five Protocols—Protocol I on Non-Detectable Fragments; Protocol II on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices (as amended on 3 May 1996); Protocol III on Prohibitions or Restrictions on the Use of Incendiary Weapons; Protocol IV on Blinding Laser Weapons; and Protocol V on Explosive Remnants of War. Thus, it has a modular design that allows new instruments to be attached to the framework treaty as humanitarian concerns around weapons systems evolve and as new systems emerge.

While discussions at human rights forums in Geneva on remotely controlled weapons in 2012–2013 were helpful in raising awareness,2 CCW turned out to be the forum of choice to discuss emerging technologies in the area of lethal autonomous weapons systems (LAWS). Its flexible nature and the balance it upholds between humanitarian principles and military necessity provided the space for States with very differing views to begin engaging on a complex and rapidly evolving technology. Its standing as an instrument of IHL, alongside the 1949 Geneva Conventions and their 1977 Additional Protocols, made it attractive to all those concerned with the potential undermining of IHL principles by autonomous combat systems. It also helped that all countries with established or emerging capabilities in AI systems—Canada, China, France, Germany, India, Israel, Japan, the Republic of Korea, the Russian Federation, the United Kingdom and the United States—are High Contracting Parties to the Convention. This is not to say that the forum was without challenges. An important and continuing problem is financial stability: arrears in payments by High Contracting Parties created uncertainty around some of the meetings in 2017. Another challenge was how to involve industry and technology developers in discussions on lethal autonomy, given, among other things, the industry’s fear of being stigmatized. A significant mindset challenge was the tendency of the traditional arms control community to see weapons in discrete material terms. Hollywood depictions of Iron Man and the Terminator did not help either.

These issues of mindsets and cross-domain literacy were tackled first through a series of informal discussions at CCW in Geneva between 2014 and 2016. The Informal Meeting of Experts, led first by Ambassador Jean-Hugues Simon-Michel of France and later by Ambassador Michael Biontino of Germany, raised awareness of the complex dimensions of the issue—humanitarian, ethical, military, legal and techno-commercial. The fact that the CCW rules of procedure allow the participation of a broad range of stakeholders, including civil society, helped, as did the raised profile of the issue in forums outside of Geneva. Informal discussions helped build consensus on the establishment of a Group of Governmental Experts (GGE) with a formal mandate at the Fifth Review Conference of the High Contracting Parties to CCW in December 2016, chaired by Ambassador Tehmina Janjua of Pakistan.3 A significant role in elevating the conversation to the next level of maturity was played by organizations such as the International Committee of the Red Cross, which held an expert meeting on technical, military, legal and humanitarian aspects of what it tentatively called “autonomous weapons systems”, from 26 to 28 March 2014. The United Nations Institute for Disarmament Research (UNIDIR), the in-house independent research arm of the United Nations on disarmament issues, contributed by developing a primer and other briefing material for negotiators and researchers. Think tanks such as the Stockholm International Peace Research Institute, Chatham House, the Harvard Law School Program on International Law and Armed Conflict, as well as NGOs such as Human Rights Watch, the International Committee for Robot Arms Control, Article 36 and Amnesty International, made equally substantive contributions.

The first formal meeting of the Group of Governmental Experts related to emerging technologies in the area of lethal autonomous weapons systems in the context of the objectives and purposes of CCW was held in Geneva from 13 to 17 November 2017. The discussion was animated by a “food-for-thought paper” from the Chair,4 along with nine working papers from High Contracting Parties and four Panels of Experts organized around the legal, ethical, military, technical and cross-cutting dimensions of the subject. Side events held by NGOs, research institutions and States enriched the discussion with new perspectives, including from young AI entrepreneurs. At the end of the week, the participants adopted a set of conclusions and recommendations.5 One conclusion was that CCW is the appropriate framework for dealing with the issue; another was that IHL applies fully to the potential development and use of LAWS. This was an important early assurance, although it did not settle the question of whether further legal norms were needed. The consensus conclusions also allowed the Chair to focus the agenda of the Group for 2018 on 1) characterization of the systems under consideration—the so-called definitional issue; 2) aspects of human-machine interaction, which were critical to the concern about potential violations of IHL; and 3) possible options for addressing the humanitarian and international security consequences of such systems. Divergent views persisted on definitions, on the risks and possible benefits of LAWS, and on approaches to regulation and control, including the idea of a pre-emptive ban; the Chair’s summary emerged as a practical device for capturing this diversity of views without letting it block progress on the pithier consensus conclusions.

GGE stepped up its work in 2018 with two sessions, in April and August. At the meeting held from 9 to 13 April in Geneva, the Group made significant progress in reaching common understandings on the quality and depth of the human-machine interface required not only for ensuring compliance with IHL but also for the eventual construction of more ambitious outcomes on human responsibility and accountability. The Group used a so-called ‘sunrise slide’ to examine the different phases of technology development and deployment, and to build an appreciation of the work required in each phase to ensure meaningful human oversight and control. With regard to characterization, the discussions enhanced common ground on the concepts and characteristics required for an eventual definition, and shifted minds away from the elusive silver bullet of a technical bright line between what is of emerging concern and what can be handled under legacy instruments. The work on common understandings and principles started in 2017 continued in April 2018 and culminated in a set of possible guiding principles at the end of the August session that year. These principles are supported by a set of building blocks on characterization, on the human-machine interface and on technology review. The GGE report presents four options for policy, including a possible legally binding constraint, which could be constructed using the agreed guiding principles and the building blocks.6

The 10 principles included the applicability of IHL; non-delegation of human responsibility; accountability for the use of force in accordance with international law; weapons reviews before deployment; the incorporation of physical, non-proliferation and cybersecurity safeguards; risk assessment and mitigation during technology development; consideration of the use of emerging technologies in the area of LAWS in compliance with IHL; non-harm to civilian research and development, and use; the need to adopt a non-anthropomorphic perspective on AI; and the appropriateness of CCW as a framework for dealing with the issue. The building blocks on characterization include the need to maintain a focus on the human element in the use of force. The understandings on the human-machine interface are built around political direction in the pre-development phase; research and development; testing, evaluation and certification; deployment, training, command and control; use and abort; and post-use assessment. The Group agreed that accountability threads together these various human-machine touch points in the context of CCW. GGE also agreed on the need to move in step with technology and to build, in partnership with industry and other stakeholders, a common scientific and policy vernacular across the globe.

Further work is required to foreclose possible harm to civilians and combatants in armed conflict in contravention of IHL obligations, the exacerbation of security dilemmas through arms races, and any lowering of the threshold for the use of force. However, the August 2018 outcome is a plus for the multilateral ethic at a challenging moment for global cooperation. It underlines the important role that United Nations forums can play in addressing the challenges posed by rapidly developing technologies by involving all important stakeholders and by leveraging the existing foundation of international law and institutions.


1Or the ‘Inhumane Weapons Convention’, opened for signature on 10 April 1981; entered into force on 2 December 1983.

2“Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns”, 2013 (A/HRC/23/47).

3Fifth Review Conference of the High Contracting Parties to CCW, “Final Document of the Fifth Review Conference”, Decision I (CCW/CONF.V/10).

4CCW/GGE.1/2017/WP.1.

5Report of the 2017 Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS), Geneva, 13–17 November 2017 (CCW/GGE.1/2017/CRP.1).

6Report of the 2018 session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Geneva, 9–13 April 2018 and 27–31 August 2018 (CCW/GGE.1/2018/3).