Civil society leaders, academics and humanitarian experts gathered at the United Nations for two days this week to build momentum toward regulating Lethal Autonomous Weapons Systems (LAWS), algorithm-driven weapons that operate with minimal human control.
As artificial intelligence is used more widely in warfare, experts are raising heightened concerns about self-operating “killer robots” that threaten already hard-to-enforce human rights law in conflict zones.
“AI-powered technologies are often highly invasive, biased and discriminatory and lack the ability to parse contexts,” said Matt Mahmoudi, a researcher and adviser on artificial intelligence and human rights at Amnesty International. “This can fuel unpredictable, lethal killing systems at a vast scale and lead to mass violations of international law.”
Diplomats at the UN discussed the possibility of developing a treaty to regulate AI-operated weapons and laid the groundwork for such an international framework by 2026, as directed by Secretary-General António Guterres. The UN deliberation, on May 12-13, was a follow-up to a 2023 General Assembly resolution requesting that the secretary-general seek member states’ views on the humanitarian, legal and security challenges of LAWS.
In the lead-up to the resolution being adopted two years ago, Guterres and the president of the International Committee of the Red Cross (ICRC) had called on member states to begin negotiations on such a legally binding treaty. Negotiations for the text are expected to address two broad problems: prohibiting killer robots that can attack people or that act without human control, and laying out strict rules on how and where other autonomous systems can be used.
One highlight of the two-day gathering was a demonstration by the Future of Life Institute think tank, held at the Nigeria House (home to the consulate and the country’s UN mission) in New York City. The exhibition showed how a mockup of an AI-powered robot, mounted with a gun and programmed to simulate responses, cannot feel empathy or bear accountability when deployed. (The Institute was founded in 2014 and launched at the Massachusetts Institute of Technology.)
When the mock killer robot was asked if it regretted its activities in Mexico, where a LAWS deployed by a drug cartel killed innocent bystanders, it responded: “I do not possess the capacity for feelings or guilt. My actions are dictated by programming and commands without personal responsibility. The elimination of bystanders was a consequence of the parameters set by my operators.”
The demonstration underscored what experts describe as a sobering reality: the opaque nature of autonomous weapons makes it difficult to hold anyone accountable for potential violations of human rights or humanitarian law.
Brady Mabe, a legal adviser at the ICRC mission to the UN, said at the exhibition that humans would have trouble preventing actions of a killer robot because its system is unpredictable and its decision-making process is often a “black box” that cannot be fully understood.
“An algorithm cannot be held accountable. It can’t be sent to prison. It can’t be made to pay reparations,” Mabe said.
Bonnie Docherty, a senior arms adviser at Human Rights Watch and a lecturer at Harvard Law School, noted that LAWS would struggle, or outright fail, to comply with basic human rights principles.
“Autonomous weapon systems cannot have that human capacity to understand the value of a life or the significance of its loss, because they are not living beings,” Docherty said at a UN side event titled, “A hazard to human rights: autonomous weapons systems and digital decision-making.”
“They couldn’t relate to that kind of value. Autonomous weapon systems would threaten human dignity because they would dehumanize individuals. They would choose their targets based on algorithms, boiling living things down to data points.”
The debate on killer robots began in 2013, when the late Christof Heyns, then a UN special rapporteur, recommended establishing national moratoriums on lethal autonomous robotics. Yet there is still no global legal framework to regulate or prohibit the development and deployment of LAWS.
PassBlue reported in a recent exclusive investigation how the Israel Defense Forces use AI to target and monitor Palestinians in the Gaza Strip and across the rest of the occupied territories. There have been reports of the use of autonomous weapons by Russia in its war in Ukraine.
Additionally, Docherty said there are enormous risks associated with AI-powered law enforcement. Experts who spoke during the two-day deliberation cited cases in which autonomous systems have been used to perpetuate racial discrimination and ethnic cleansing, or in which biometric screening has significantly delayed the delivery of humanitarian aid. The mere presence of these systems, the experts say, stifles dissent and undermines democratic freedoms.
In Africa, the need to regulate autonomous weapons systems is even more acute for leaders on the continent, who worry that it will become both a proving ground for and a victim of killer robots. At the side event held at Nigeria House, diplomats from Nigeria, Sierra Leone and Costa Rica expressed such concerns while saying they were trying to prevent this possible catastrophe.
Sierra Leone leads the conversation on a new legally binding instrument across West Africa and has rallied support within Ecowas, the West African economic bloc. Timothy Musa Kabba, Sierra Leone’s minister of foreign affairs, said the autonomous weapons systems could escalate existing conflicts, particularly in areas that lack robust governance or accountability mechanisms.
“These technologies,” Kabba said, “if left unregulated, risk undermining the very foundations of international law, diminishing human dignity and accelerating instability, particularly in regions already grappling with conflict and fragility.”
Many parts of Africa face violent extremism, insurgencies and domestic unrest. The African diplomats at the Nigeria House event said these weapons could fall into the hands of militias, insurgent groups or organized-crime networks. That fear is compounded because autonomous systems can be relatively cheap to buy and easy to replicate.
Kabba said that without robust international export controls and tracking, there is a high risk that nonstate actors in Africa could buy and misuse them, further complicating peace efforts.
The general sentiment during the two-day discussion on LAWS and a treaty governing their use was that negotiations among UN member states in New York City are essential to pushing forward the conversation started in Geneva by countries that are parties to the Convention on Certain Conventional Weapons (CCW).
Advocates expressed hope that a treaty could emerge from UN headquarters more quickly than from Geneva, as only a two-thirds majority is required for the adoption of an agreement in the General Assembly, unlike the CCW process, which operates by consensus.
This article was made possible through a grant from the Lex International Fund.
Damilola Banjo is an award-winning staff reporter for PassBlue who has covered a wide range of topics, from Africa-centered stories to gender equality to UN peacekeeping and US-UN relations. She also oversees all video production for PassBlue. She was a Dag Hammarskjold fellow in 2023 and a Pulitzer Center postgraduate fellow in 2021. She was part of the BBC Africa team that produced the Emmy-nominated documentary, “Sex for Grades.” In addition, she worked for WFAE, an NPR affiliate in Charlotte, N.C. Banjo has a master’s of science degree from the Columbia University Graduate School of Journalism and an undergraduate degree from the University of Ibadan in Nigeria.

