Human Rights Watch Campaigns Against 'Killer Robots'


An undated U.S. Air Force image shows an MQ-1 Predator unmanned aircraft.
Technology is moving fast when it comes to "autonomous systems" - intelligent machines that perform tasks with little or no human guidance.

In modern warfare, drones and other unmanned vehicles are playing an increasing role, with militaries embracing a technology that they say makes war safer and more effective. But human rights campaigners fear what might be to come - fully autonomous weapons that could select and engage targets without human intervention - and they want a new global treaty to stop that from happening.

An Iron Dome launcher fires an interceptor rocket near the southern town of Sderot, November 15, 2012.
Israel's missile defense system, the Iron Dome, autonomously senses an incoming rocket and sends a warning to an operator, who then gives the command to fire an interceptor missile.

During the recent cross-border violence between Israel and Gaza, Israeli officials said the defense system had an 80-90 percent success rate.

Unmanned Aerial Vehicles, or drones, have also played a growing military role, especially in U.S. campaigns.

They provide surveillance, identify targets, and can deliver lethal force - but only if an operator gives the go-ahead.

But David Mepham, the United Kingdom director of Human Rights Watch, said that within decades technological advances could write the human operator out of the equation.

"Drones are not fully autonomous weapons," Mepham said. "They involve human intervention in terms of their targeting and the decision to strike, but that has been an increasing trend in the way Western militaries, in particular, have been going in recent years. This will be several technological steps beyond that. It will be a weapons system that takes the human beings out of the loop."

Human Rights Watch has jointly published a report with Harvard Law School's International Human Rights Clinic arguing that within 30 years militaries could be armed with autonomous "killer robots."

They said such weapons would be inconsistent with international humanitarian law and would increase the risk to civilians during armed conflict.

To prevent a move in that direction, the campaigners are pushing for a global treaty banning such weapons, similar to the agreements that outlaw landmines and cluster bombs.

"One of the things that holds us back from barbarism in contexts of war is this distinction between combatants and civilians," said Mepham. "And we are worried about a robotic weapon of the future not being able to tell the difference between a child holding out an ice cream and someone holding a weapon."

The U.S. and other militaries have said they have no plans to remove human supervision over the decision to use lethal force, despite advances in technology.

But Britain-based independent security analyst Hugo Rosemont said there should be a public discussion around the future use of autonomous technology, and not only with regard to its military potential.

"There also needs to be a public discussion around some of the wider applications, such as in the use of disaster management and humanitarian relief," said Rosemont. "These technologies can be deployed and have been in those circumstances, and that should be part of the wider discussion in what we think of as autonomy."

He said robots could well do the world plenty of good in the years to come. France sent remote-controlled robots to Japan last year to help contain the Fukushima nuclear disaster - just one job better left to machines.