Partially Autonomous Missile Defense Systems Need Threats Assessment - NGOs

© Flickr / The U.S. Army. A ground-based missile interceptor is lowered into its missile silo.
MOSCOW (Sputnik) — States that deploy defense systems with a high degree of autonomy should undertake accurate assessment of possible dangers and ensure meaningful human control, speakers at this week's UN conference on Lethal Autonomous Weapons Systems (LAWS) told Sputnik.

Israel's Iron Dome and the US Phalanx and C-RAM are defense systems with a high degree of autonomy, programmed to respond automatically to incoming threats.

Although such systems are different from fully autonomous weapons, which are the main subject of the discussion in Geneva, they also raise concerns over possible dangers for humans.

"We believe that states should make an assessment of existing systems that have a high degree of autonomy in order to articulate why they do not pose the same dangers that killer robots might," said Steve Goose, director of the Arms Division at Human Rights Watch (HRW) and co-founder of the Campaign to Stop Killer Robots.

LAWS, also known as killer robots, can be vulnerable to jamming, cyberattacks and other failures, while allowing a machine to select targets and carry out an attack is incompatible with ethical principles and human ability to value life, according to campaigners.

"Such systems that detect incoming missiles are a bit different from systems that would be able themselves to select targets to attack from a range of possible targets. We think states with those systems should explain how they ensure meaningful human control over every individual attack," Thomas Nash, director of Article 36, an organization that works to prevent "unnecessary harm," told Sputnik.

One of the concerns about defense systems with a high degree of autonomy is that such weapons could be extended for use against humans, Noel Sharkey, chairman of the International Committee for Robot Arms Control (ICRAC), explained.

More than 40 analysts and campaigners from a dozen countries convened in Geneva on Monday to discuss LAWS and to examine technical and legal aspects of the production of autonomous weapons.

Campaigners are calling for a preemptive ban on fully autonomous weapons. However, the United States, the United Kingdom, France and Israel appear to oppose the ban, arguing it is unnecessary because existing international humanitarian law is sufficient to address the issue.
