
Human Rights Groups Call for Ban on Killer Robots

© AP Photo: Visitors walk past a model of Israeli weapons company Rafael's C-Dome at the Euronaval show in Le Bourget, north of Paris.
Human rights groups are calling for a total ban on so-called ‘killer robots' – weapons that arms manufacturers are developing to replace soldiers with autonomous systems able to select and engage targets without human intervention.

Human rights groups believe that killer robots raise major moral and legal issues, since Lethal Autonomous Weapons Systems (LAWS) operate without human intervention. Campaigners are lobbying delegates at a United Nations expert meeting on LAWS, which opened in Geneva on Tuesday.

Although fully autonomous weapons do not yet exist, technology is moving in their direction, and precursors already exist, such as the Israeli Iron Dome and the US Phalanx and C-RAM, which are weapons systems programmed to respond automatically to threats from incoming fire.

A Human Rights Watch report concluded: "Many people question whether the decision to kill a human being should be left to a machine. There are also grave doubts that fully autonomous weapons would ever be able to replicate human judgment and comply with the legal requirement to distinguish civilian from military targets.

"The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position. On the one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force."

Human Rights at Risk

Speaking from the Geneva conference, Thomas Nash, of UK campaign group Article 36, told Sputnik:

"The act of taking a human life requires an understanding of the value of human life. That is at the very heart of what we're discussing here. That's the moral basis. The idea that you could have a machine that is programmed to undertake the act of firing a missile or dropping a bomb is morally repugnant."

However, the British Foreign Office said it saw no need for the prohibition of LAWS.

A spokesman told the Guardian: "At present, we do not see the need for a prohibition on the use of LAWS, as international humanitarian law already provides sufficient regulation for this area.

"The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control."


"As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems."

Thomas Nash told Sputnik: "I think it's pretty short-sighted to think that human control is sufficient, by having a human being that's programmed the system and then having a human being that deploys the system.

"We're talking about systems that could roam far and wide. Aircraft that could spend hours and hours in the air selecting their own targets and then firing upon those targets based on pre-programmed parameters for target selection. I don't think that constitutes meaningful human control. And I think most of the governments here at the conference in Geneva would agree with us."
