On 16 October 2017, the Permanent Mission of Mexico to the United Nations partnered with the International Committee for Robot Arms Control, Human Rights Watch, Seguridad Humana en Latinoamérica y el Caribe and the Campaign to Stop Killer Robots to host a panel discussion entitled “Pathways to Banning Fully Autonomous Weapons” as part of the First Committee side event series for the 72nd Session of the General Assembly. Ambassador Juan Sandoval Mendiolea, Deputy Permanent Representative of Mexico to the United Nations, introduced the discussion by noting that Mexico has joined other states in calling for a ban on lethal autonomous weapons systems, also known as fully autonomous weapons. Referencing possible implications for the 2030 Agenda for Sustainable Development, Ambassador Mendiolea stressed the importance of continued research and education on these weapons systems and their possible dangers.
Ms. Mary Wareham, global coordinator of the Campaign to Stop Killer Robots and moderator of the panel, described autonomous weapons as systems that select targets and use force without meaningful human control. She said the side event would consider the rationale for prohibiting lethal autonomous weapons systems, as well as pathways to concluding a new international treaty that would ban them. Professor Noel Sharkey, from the International Committee for Robot Arms Control, continued the discussion by focusing on the technologies in such systems. He mentioned the difficulty of formulating a precise definition of an autonomous weapon, and he noted that the United States Department of Defense references a need for “appropriate levels of human judgment” over these systems. Professor Sharkey urged the international community to further discuss controls on fully autonomous weapons. He observed that countries view these weapons as a way to assert dominance, yet if systems running differing algorithms were to meet, the outcome would be unpredictable. According to Professor Sharkey, this is reason enough not to encourage the use of such systems.
He added that a massive race for artificial intelligence (AI) has been underway since the 1980s, and that recent increases in computational speed have contributed to techniques for interpreting “big data”, such as statistical pattern recognition. With these new AI capabilities, many systems have acquired biases, most notably concerning gender and race; in areas such as predictive policing, mortgage lending and job applications, this bias can be witnessed firsthand. Referring to future warfare, Professor Sharkey posed the question, “How do we distinguish between a civilian and an insurgent?” He said these machines do not reason in the same terms as humans, and it would be unwise for humanity to start using them on the battlefield.
Ms. Bonnie Docherty, senior researcher in the Arms Division at Human Rights Watch, addressed the need to focus discussions on fully autonomous weapons. Some states have argued that discussions of a potential ban cannot progress without a clear definition. According to Ms. Docherty, fully autonomous weapons engage targets independently. They are weapons where humans are “out of the loop,” she said, noting that current armed drones do not fit this definition. She advocated against such full autonomy, stating that humans should exert control not solely during the design phase, but also in the specific moments when the weapons could employ force. She added that some states and leading experts foresee possible conflicts with international humanitarian law, for example if an autonomous machine caused fatalities after failing to recognize behavioral or facial indications that a target was inappropriate. Such a scenario would create an issue of accountability: would those who deploy fully autonomous weapons be considered responsible for their actions? Finally, she noted a moral concern, rooted in human dignity, about these machines making life-and-death decisions. Ms. Docherty concluded her remarks by reiterating the need to focus international discussion on this technology, as these weapons systems should be preemptively banned.
Mr. Camilo Serna, from Seguridad Humana en Latinoamérica y el Caribe, closed the panel discussion by providing an overview of the progress made over the past few years toward stopping killer robots. He posed questions to the panelists, including, “What are the technologies contributing to fully autonomous weapons systems?” “Are autonomous systems best visualized as robots or machines?” And, “Will the international community determine if a computer can be a weapon?” Five years of discussion without a legally clear definition have made this problem difficult to resolve, Mr. Serna noted.
This side event probed a number of the technical, legal, ethical, operational and other concerns over fully autonomous weapons, as well as explaining why an international ban treaty is needed and what it will take to achieve one.
Text and photos by Gillian Linden