An international coalition is calling for a ban on fully autonomous weapons known as "killer robots." The 45-member Campaign to Stop Killer Robots says it wants the United Nations to draft an international treaty to outlaw the use of these robotic weapons.
The Campaign to Stop Killer Robots is taking its case to governments attending the annual meeting of the Convention on Conventional Weapons here in Geneva this week. The group of non-governmental organizations says it wants the U.N. gathering to agree to add fully autonomous weapons to the Convention's work program in 2014.
Fully autonomous weapons, or "killer robots," have not yet been developed, but technology is moving toward increasing autonomy. Such weapons would select and fire on targets without human intervention.
Noel Sharkey chairs the International Committee for Robot Arms Control and is a founding member of the Campaign to Stop Killer Robots. He says autonomous weapons should be banned outright.
"The big problem for me is that there are no robot systems that can discriminate between civilian targets and military targets unless they are very, very clearly marked in some way…so, the idea of having robots going out into the field and selecting their own targets is to me, is just horrifying. It cannot work, " said Sharkey.
While such weapons do not yet exist, the activists say several robotic systems with varying degrees of autonomy and lethality are already in use by Britain, Israel, the United States and South Korea. They say other nations, such as China and Russia, are believed to be moving toward these systems.
The director of the Arms Division at Human Rights Watch and a member of the campaign, Steve Goose, warns that killer robots will become a reality if governments do not act now to ban them. He says the technology and doctrine are headed toward greater autonomy on the battlefield.
While fewer and fewer soldiers are on the battlefield, he says, many civilians remain. Goose says a line must be drawn at weapons systems able to select and attack targets automatically.
He says this concept crosses a fundamental moral and ethical line.
"Armed robotic weapons systems should not make life and death decisions on the battlefield. There is simply something inherently wrong with that," said Goose. "So, they need to be banned on ethical grounds. We think they also need to be banned on legal grounds. If and when a killer robot commits a war crime, violates international humanitarian law…who would be held accountable, who would be responsible for that violation?"
Goose says in recent months, fully autonomous weapons have gone from an obscure issue to one that is commanding worldwide attention. He says that since May, 34 countries, including several that are developing these systems, have openly expressed concern about the dangers the weapons pose.
He notes that in 1995, the Convention on Conventional Weapons created a protocol to the treaty that pre-emptively banned blinding lasers. Goose says he believes killer robots could become the second such weapon to be prohibited before ever being used on the battlefield.