
Deepfakes Could Be Used as Pretext for Wars, Tech CEO Says

© Photo: YouTube / derpfakes | Trump deepfake replacement
Celebrities and top politicians have repeatedly been targeted by deepfake creators, whether to demonstrate the progress of artificial intelligence technologies, to tarnish their image, or simply for fun. While deepfakes are still relatively easy to spot, experts warn that this is about to change drastically, posing new threats to humankind.

Deepfakes could be weaponised to trigger international conflicts, providing “a pretext to strike first and go to war”, Shamir Allibhai, CEO of the video verification company Amber, told The Daily Star against the backdrop of US President Donald Trump’s rhetoric towards Iran and news of nuclear tests from North Korea.

“Countries may even manufacture 'evidence' with deepfakes as a pretext to strike first and go to war. Or a third country creates and distributes deepfakes to provoke conflict between two of its enemies,” he argued.

The tech expert suggested that if militaries around the world do not develop tools to verify increasingly sophisticated fakes, including videos featuring digital doppelgangers of national leaders, real wars could break out.

"Imagine that a President declares war on a foreign country. The foreign country sees the declaration but is the recording real? Fake? Should the foreign country pre-emptively strike in the event it is real? Or did the foreign country actually create the fake video as a pretext for war and to justify it to their citizens and the international community?” he theorised.

He pointed to Colin Powell’s presentation at the UN on Iraq’s alleged weapons of mass destruction programme, which was never proven to exist, and noted that countries need to be more sceptical, as the days when a video could simply be trusted have passed. According to him, editing both audio and video is “almost as easy as editing text in Word”, so one will soon be able to make a person appear to say or do anything on video. He noted that some states are already no strangers to using fake imagery for propaganda “in an attempt to boost their people’s morale or to sow fear amongst their enemies”, though he did not name names.

Top politicians have repeatedly become the targets of deepfakes for demonstration purposes. Recently, MIT Technology Review Editor-in-Chief Gideon Lichfield transformed himself into Russian President Vladimir Putin to demonstrate new real-time deepfake technology, turning the presentation into an improvised interview in which he played both parties. However, the deepfake, which spoke Russian with a thick American accent, could hardly be called a realistic forgery of the Russian president, even though it bore a certain resemblance to Vladimir Putin.
