Warfare Will Be Revolutionized: UN Debates Autonomous Weapons, Many Call for Ban

The United Nations has convened a week-long meeting on lethal autonomous weapons systems, as a number of groups and academics call for an international ban on such weapons.

Talks on lethal autonomous weapons systems began at the United Nations on November 13, amid calls for an international ban on independent "killer robots" that could revolutionize warfare. The discussions, held under the banner of the Convention on Certain Conventional Weapons, are scheduled to last all week.

The summit in Geneva comes after more than 100 major figures in technology and science co-signed a letter in July warning that such weapons systems could lead to a "third revolution in warfare."

"Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. The deadly consequence of this is machines — not people — will determine who lives and dies," the letter read.

Signatories included Steve Wozniak, Stephen Hawking, Elon Musk and Noam Chomsky.

While there is presently no international consensus on what constitutes a lethal autonomous weapon system, such weapons are often defined as systems capable of targeting or firing without meaningful human control, functioning independently via artificial intelligence and machine learning.

Many modern weapons, including drones, precision-guided munitions and defense batteries, already have some level of autonomy, but remain dependent on varying degrees of human control, although the average citizen may underestimate how automated and computerized warfare already is. The distance between humans and battlefields is already significant in some cases, and is growing constantly.


Look Mum, No Hands

Nonetheless, lethal autonomous weapons systems do not literally exist today, or at least have not been revealed to the public. Perhaps the closest any military has come is Israel's Harpy anti-radar drone: after launch, it flies over an area searching for radar signals that fit predetermined criteria, then strikes.

South Korea has also developed a sentry gun system to guard its border with North Korea, which combines surveillance sensors and target tracking with automatic firing and can be made completely autonomous, although it currently engages targets only with human approval.

​"If this trend towards autonomy continues, the fear is humans will start to fade out of the decision-making loop, first retaining only a limited oversight role, and then no role at all. The US and others state lethal autonomous weapon systems "do not exist" and do not encompass remotely piloted drones, precision-guided munitions, or defensive systems," the Campaign to Stop Killer Robots said in a statement.

The NGO added that while the capabilities of future technology are uncertain, there are "strong reasons" to believe fully autonomous weapons could never replicate the full range of inherently human characteristics necessary to comply with international humanitarian law. Moreover, it claimed existing mechanisms for legal accountability are ill-suited and inadequate to address the unlawful harm fully autonomous weapons could cause.

Gone Awry

Several countries, including the US, Russia, China and Israel, are researching and/or developing lethal autonomous weapons systems, and the implications of their use are not entirely negative. Technologically advanced sensors and guidance systems could improve targeting and produce fewer unintended casualties than traditional bombing.

Retired US Colonel Brian Hall, autonomy program analyst at the Joint Chiefs of Staff, praised the technology in July, arguing it would positively augment human decision-making, not replace it.

"The role of humans will aggregate at the higher cognitive processes such as operational planning and analysis. Increased automation or autonomy can have many advantages, including increased safety and reliability, improved reaction time and performance, reduced personnel burden with associated cost savings, and the ability to continue operations in communications-degraded or —denied environments," he wrote.

However, Hall acknowledged that the rapid pace of scientific and technological advance made the future capabilities of autonomous weapons difficult to predict, and would require legal and policy changes.
