Rich Walker, the head of Shadow Robot Company, has warned of catastrophic consequences for mankind if artificial intelligence is used in military conflicts. In an interview with the Daily Express, he said that organisations studying the use of autonomous weapons systems in conflict agree that deploying AI in these circumstances is a bad idea. "The positives of doing it are so small compared to the huge stack of negatives, so really we shouldn’t be doing this”, Walker cited the organisations as saying.
He said that the international community should treat this technology the same way it treats other dangerous technologies that combatants turn to in desperate situations. “We don’t allow people to use chemical weapons in war, we don’t allow people to use biological weapons and the use of land mines and cluster mines is heavily regulated or controlled”, Walker explained.
He added that he opposes the American model of weapons testing, which he described as “let us see what goes wrong and what we can do about this". "Overall with AI and warfare, we can see some of the things that could go wrong so let us stop them from ever happening if we can”, Walker said.
Earlier, in September, Microsoft President Brad Smith said that the world needs a global convention on the use of autonomous weapons systems, stressing that robots should “not be allowed to decide on their own to engage in combat and who to kill".