UN panel agrees to move ahead with debate

Geneva: A UN panel is moving ahead with efforts to define and possibly set limits on weapons that can kill without human involvement, but not as quickly as some human rights groups and other opponents think is necessary to keep up with technological advances.

Advocacy groups showed a harrowing video depicting the possible threats and aired other concerns on the sidelines of the first formal UN meeting of government experts on Lethal Autonomous Weapons Systems, known as "killer robots." The weeklong meeting ended on Friday.

The Campaign to Stop Killer Robots, an umbrella group of advocacy organisations, says 22 countries support a ban on the weapons, and the list is growing. Indian ambassador Amandeep Gill, the meeting chairman, says participants agreed to a follow-up meeting next year. UN officials say fully autonomous killer robots do not yet exist.

According to previous reports, more than 100 artificial intelligence entrepreneurs, led by Tesla's Elon Musk, urged the UN in August to enforce a global ban on fully autonomous weapons, echoing calls from activists who have warned the machines will put civilians at enormous risk.

A UN disarmament grouping known as the Convention on Certain Conventional Weapons (CCW) held talks on the issue in Geneva. Earlier, Gill, India's ambassador on disarmament, said that anything resembling a ban, or even a treaty, remains far off. "It would be very easy to just legislate a ban, but I think... rushing ahead in a very complex subject is not wise," he told reporters. "We are just at the starting line."

He said the discussion, which will also include civil society and technology companies, will be partly focused on understanding the types of weapons in the pipeline. Proponents of a ban, including the Campaign to Stop Killer Robots pressure group, insist that human beings must ultimately be responsible for the final decision to kill or destroy.

They argue that any weapons system that delegates the decision on an individual strike to an algorithm is by definition illegal, because computers cannot be held accountable under international humanitarian law.

Gill said there was agreement that "human beings have to remain responsible for decisions that involve life and death". But, he added, there are varying opinions on the mechanics through which "human control" must govern deadly weapons.

The International Committee of the Red Cross, which is mandated to safeguard the laws of conflict, has not called for a ban, but has underscored the need to place limits on autonomous weapons.

"Our bottom line is that machines can't apply the law and you can't transfer responsibility for legal decisions to machines", Neil Davison of the ICRC's arms unit told AFP. He highlighted the problematic nature of weapons that involve major variables in terms of the timing or location of an attack -- for example something that is deployed for multiple hours and programmed to strike whenever it detects an enemy target.

"Where you have a degree of unpredictability or uncertainty in what's going to happen when you activate this weapons system then you are going to start to have problems for legal compliance," he said. The UN meeting also featured wide-ranging talks on artificial intelligence, triggering criticism that the CCW was drowning itself in discussions about new technologies, instead of zeroing in on the urgent issue.

"There is a risk in going too broad at this moment," said Mary Wareham of Human Rights Watch, who is the coordinator of the Campaign to Stop Killer Robots. "The need is to focus on lethal autonomous weapons", she told AFP.

The open letter co-signed by Musk as well as Mustafa Suleyman, co-founder of Google's DeepMind, warned that killer robots could become "weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways." "Once this Pandora's box is opened, it will be hard to close," they said.
