Lethal Autonomous Weapon Systems (LAWS)

What are LAWS?

At present, no commonly agreed definition of Lethal Autonomous Weapon Systems (LAWS) exists.

An overview of characterizations of these systems brought forward within the Group of Governmental Experts on LAWS can be found in document CCW/GGE.1/2023/CRP.1 (as of 2023).

Do LAWS exist?

States are increasingly developing and deploying weapons with autonomous functions. However, certain systems incorporating rudimentary autonomous functions have been in existence for decades.

The most common types of weapons with autonomous functions are defensive systems. These include anti-vehicle and anti-personnel mines, which, once activated, operate autonomously based on trigger mechanisms.

Newer systems employing increasingly sophisticated technology include missile defense systems and sentry systems, which can autonomously detect and engage targets and issue warnings. Other examples include loitering munitions (also known as suicide, kamikaze or exploding drones), which contain a built-in warhead (munition) and wait (loiter) around a predefined area until a target is located by an operator on the ground or by automated onboard sensors, and then attack the target. These systems first emerged in the 1980s; however, their functionalities have since become increasingly sophisticated, allowing for, among other things, longer ranges, heavier payloads and the potential incorporation of artificial intelligence (AI) technologies.

Land and sea vehicles with autonomous capabilities are also increasingly being developed. Those systems are primarily designed for reconnaissance and information gathering but may possess offensive capabilities.

What is the role of artificial intelligence (AI) in LAWS?

Autonomous weapons systems require “autonomy” to perform their functions in the absence of direction or input from a human actor. Artificial intelligence is not a prerequisite for the functioning of such systems, but, when incorporated, it can further enable them. In other words, not all autonomous weapons systems rely on AI to execute particular tasks. Autonomous capabilities can be provided through pre-defined tasks or sequences of actions based on specific parameters, or by using artificial intelligence tools to derive behavior from data, allowing the system to make independent decisions or adjust its behavior in response to changing circumstances. Artificial intelligence can also be used in an assistance role in systems that are directly operated by a human. For example, a computer vision system operated by a human could employ artificial intelligence to identify and draw attention to notable objects in the field of vision, without having the capacity to respond to those objects autonomously in any way.
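
To illustrate this assistance role in concrete terms, the short Python sketch below shows a detection loop whose only output is a notification drawing a human operator's attention to a detected object; it contains no code path for acting on the detection. It is a simplified illustration only, and the names used (Detection, detect_objects, assistive_loop) are assumptions for the example rather than references to any actual system:

# A minimal, hypothetical sketch of AI in an *assistance* role: a detector
# flags notable objects for a human operator and never acts on them itself.

from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    label: str          # e.g. "vehicle", "person"
    confidence: float   # model confidence in [0, 1]
    box: tuple          # (x, y, width, height) in frame coordinates


def detect_objects(frame) -> List[Detection]:
    # Hypothetical placeholder for a vision model's inference call.
    # A real system would run the frame through a trained detector;
    # here a canned example is returned so the sketch executes end to end.
    return [Detection(label="vehicle", confidence=0.91, box=(40, 60, 120, 80))]


def assistive_loop(frames, confidence_threshold: float = 0.8) -> None:
    # Highlight confident detections for a human reviewer.
    # The system's role ends at drawing attention: every subsequent
    # decision is left to the human operator.
    for frame_id, frame in enumerate(frames):
        for det in detect_objects(frame):
            if det.confidence >= confidence_threshold:
                print(
                    f"frame {frame_id}: {det.label} "
                    f"(confidence {det.confidence:.2f}) at {det.box}"
                )


if __name__ == "__main__":
    # Two dummy "frames" stand in for a video feed.
    assistive_loop(frames=[object(), object()])

The deliberate absence of any response logic is what distinguishes such an assistive configuration from a system that can respond to detected objects autonomously, as described above.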

What is the position of the United Nations on LAWS?

Since 2018, United Nations Secretary-General António Guterres has maintained that lethal autonomous weapons systems are politically unacceptable and morally repugnant and has called for their prohibition under international law. In his 2023 New Agenda for Peace, the Secretary-General reiterated this call, recommending that States conclude, by 2026, a legally binding instrument to prohibit lethal autonomous weapon systems that function without human control or oversight, and which cannot be used in compliance with international humanitarian law, and to regulate all other types of autonomous weapons systems. He noted that, in the absence of specific multilateral regulations, the design, development and use of these systems raise humanitarian, legal, security and ethical concerns and pose a direct threat to human rights and fundamental freedoms.

United Nations independent experts have also expressed concerns regarding lethal autonomous weapons systems. The UN Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, was the first to raise the alarm about lethal autonomous weapons systems, in a report to the Human Rights Council in 2013. The UN Special Rapporteur on counter-terrorism and human rights, Fionnuala Ní Aoláin, joined the Secretary-General’s call for a global prohibition on lethal autonomous weapons systems in a report to the Human Rights Council in 2023.