11:52 GMT, 19 January 2021

    Since 2010, the evolution of drone technology has gone into hyperdrive: where once complete human control was obligatory, the machines are now equipped with ever-increasing levels of autonomy. While no fully self-directed system yet exists, the technology is certainly being tested and developed in several countries.

    Militaries the world over are excited by the prospect of autonomous drones, but many leading figures in science and technology find it deeply troubling, among them famed theoretical physicist Stephen Hawking, Tesla and SpaceX chief Elon Musk, Microsoft co-founder Bill Gates and Apple co-founder Steve Wozniak, who have all urged a preemptive prohibition on the use of autonomous weapons and artificial intelligence in warfare.

    As yet, there is no universally agreed or legal definition of the term autonomous drone. Industry uses the "autonomy" label extensively, often for marketing purposes, as it gives an impression of very modern and advanced technology. In truth, there are no truly autonomous devices or machines in existence, capable of directing themselves entirely and switching themselves on and off; human involvement of some degree is always required at some stage.

    A control system for unmanned drones, at the Farnborough aerospace show
    © AP Photo / Lefteris Pitarakis

    However, some countries do have individual definitions — for instance, the UK Ministry of Defence's 2011 paper, The UK Approach to Unmanned Aircraft Systems, describes autonomous vehicles as "capable of understanding higher level intent and direction." Countries that field partially autonomous technology commonly add the prefix "remotely piloted" to such equipment, underlining that it operates under the direct control of human beings.

    It's almost unquestionable, however, that "autonomous" will one day mean drone systems that can act independently, on their own initiative. Such technology is already largely developed — but as far as is publicly known at least, no system is fully operational as of July 2017. What has limited full development, and indeed deployment, are the major legal and ethical question marks hanging over whether lethal machines able to operate without direct human control should be used, or indeed exist at all.

     US Predator unmanned drone armed with a missile

    One of the greatest challenges for the development and approval of autonomous drones is that it's extremely difficult to devise satisfactory validation systems, ensuring the technology is safe and acts as humans would in comparable scenarios. Truly sophisticated drones would need to be programmed for an almost innumerable range of potential courses of action, requiring the use of machine learning and artificial intelligence to learn and develop a modus operandi.

    Autonomous drones used in war would be de facto subject to the general principles and rules of the Law of Armed Conflict, just as any other weapon, system or platform — as such, they could only be directed at lawful targets (military objectives and combatants), and attacks must not cause excessive collateral damage. In attack decisions, military commanders must take all "feasible precautions" to "verify" that the attack is not directed at a protected person or object, and that it is not expected to violate the principle of proportionality.

    However, many of the Law's restrictions and obligations are somewhat elastic, giving military commanders "wiggle room" to interpret and decide what certain principles mean in practice.

    Military vehicles carry Wing Loong drones, a Chinese made medium-altitude long-endurance unmanned aerial vehicle, past spectators during a parade commemorating the 70th anniversary of Japan's surrender during World War II held in front of Tiananmen Gate in Beijing, Thursday, Sept. 3, 2015
    © AP Photo / Ng Han Guan

    Autonomous drones will likely never be capable of reasoning, in the human sense of the term, as they lack consciousness — how would such a device ever be able to use its "discretion" to interpret the practical meaning of battlefield regulations?

    Legal and ethical considerations may well be separate questions, given the two are not inextricably linked, and legislation is invariably several steps behind emergent technologies.

    Those who oppose autonomous weapons generally do so on the basis that such systems would effectively be given decision-making power in life-and-death situations, and that machines should never be afforded sole responsibility for deciding to end a life. Making a "moral" judgment about who and what is and isn't a legitimate target on a battlefield is difficult enough for human beings — so fears that drones would make decisions resulting in greater civilian casualties and destruction are potentially well-founded.

    That this risk exists is unlikely to deter Western military powers, in particular the US, from pursuing the goal of fully autonomous drones — particularly as it is arguable that deploying robotic soldiers in war is morally preferable to deploying human ones. Autonomous drones could complement human military activity by scouting dangerous areas and performing high-risk tasks such as bomb disposal, lessening the risk of human casualties in the process.

    After all, highly sophisticated autonomous drones would be able to process incoming sensory information faster, more rationally and more effectively than human soldiers, and thus make better battlefield decisions. An autonomous drone that lacks human emotions such as fear, rage and vindictiveness could even potentially reduce the risk of war crimes.

    Clearly, autonomous drones raise important legal and ethical issues about responsibility for unintentional harm. The technologies create moral accountability gaps: when autonomous military systems are deployed, it becomes less clear how to apportion responsibility, and such potential responsibility gaps must be addressed properly through technical solutions and legal regulations. NATO and its Allies should therefore engage in international discussions on these topics. At the same time, technological evolution will continue, and an autonomous drone — no matter how technologically sophisticated its design — remains a product, a tool in the hands of humans. Our fundamental responsibility for war and how wars are fought can never be morally "outsourced," least of all to machines.

    It would almost certainly be preferable to develop strong legal and ethical frameworks governing autonomous weapon usage before they become a permanent fixture in theaters of war — particularly as it's impossible to know with any certainty when that point will come.

