The United Nations is set to host talks on the use of autonomous weapons, but those hoping for a ban on the machines dubbed “killer robots” will be disappointed, the ambassador leading the discussions said Friday.
More than 100 artificial intelligence entrepreneurs led by Tesla’s Elon Musk in August urged the U.N. to impose a global ban on fully autonomous weapons, echoing calls from activists who have warned the machines will put civilians at enormous risk.
A U.N. disarmament grouping known as the Convention on Certain Conventional Weapons (CCW) will on Monday begin five days of talks on the issue in Geneva.
But anything resembling a ban, or even a treaty, remains far off, said India’s disarmament ambassador, Amandeep Gill, who is chairing the meeting.
“It would be very easy to just legislate a ban but I think … rushing ahead in a very complex subject is not wise,” he told reporters. “We are just at the starting line.”
He said the discussion, which will also include civil society and technology companies, will be partly focused on understanding the types of weapons in the pipeline.
Proponents of a ban, including the Campaign to Stop Killer Robots pressure group, insist that human beings must ultimately be responsible for the final decision to kill or destroy.
They argue that any weapons system that delegates the decision on an individual strike to an algorithm is by definition illegal, because computers cannot be held accountable under international humanitarian law.
Gill said there was agreement that “human beings have to remain responsible for decisions that involve life and death.”
But, he added, there are varying opinions on the mechanics through which “human control” must govern deadly weapons.
Machines ‘can’t apply the law’
The International Committee of the Red Cross, which is mandated to safeguard the laws of conflict, has not called for a ban, but has underscored the need to place limits on autonomous weapons.
“Our bottom line is that machines can’t apply the law and you can’t transfer responsibility for legal decisions to machines,” Neil Davison of the ICRC’s arms unit told AFP.
He highlighted the problematic nature of weapons that involve major variables in terms of the timing or location of an attack — for example, something that is deployed for multiple hours and programmed to strike whenever it detects an enemy target.
“Where you have a degree of unpredictability or uncertainty in what’s going to happen when you activate this weapons system, then you are going to start to have problems for legal compliance,” he said.
Next week’s U.N. meeting will also feature wide-ranging talks on artificial intelligence, triggering criticism that the CCW is drowning itself in discussions about new technologies instead of zeroing in on the urgent issue.
“There is a risk in going too broad at this moment,” said Mary Wareham of Human Rights Watch, who is the coordinator of the Campaign to Stop Killer Robots.
“The need is to focus on lethal autonomous weapons,” she told AFP.
The open letter co-signed by Musk as well as Mustafa Suleyman, co-founder of Google’s DeepMind, warned that killer robots could become “weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”
“Once this Pandora’s box is opened, it will be hard to close,” they said.