Sputnik International

Security Pro Warns of Dangers to Society as 'Deepfake' Porn Becomes as Easy as an Instafilter

© AP Photo / A man takes a picture of a woman at the AVN Adult Entertainment Expo in Las Vegas.
Along with some amusing perks, such as putting Elon Musk’s face on someone else or watching Bruce Lee play Neo in “The Matrix”, the AI-based technique that lets users superimpose one person’s face onto another’s via machine learning has raised alarms, as it has become clear that anyone’s face can easily be swapped onto that of a porn actor.

Shamir Allibhai, the CEO of a video verification company, told The Daily Star that it is only a matter of time before anybody can be convincingly face-swapped into so-called deepfake porn. While celebrities have already been targeted by such videos streamed online, the tool is only expected to become more accessible. According to Allibhai, women “will be the primary target of the weaponisation of this deepfake technology.”

"The havoc is two-fold. At a primary level, relationships will be broken, people will be blackmailed. On a deeper level, society will become cynical if we don’t have video veracity solutions in effect. We will evolve to become distrusting and view everything with skepticism,” the company expert explained.

Allibhai suggested that pornographic deepfakes will chip away at the “trust among citizens, a foundation of democracies," predicting that existing audio and video sources, such as CCTV, voice recorders, police body cams, and bystanders’ cell phones, can be doctored. According to him, the public will be hit with deepfake audio first.

"It is also much easier to create believable fake audio than it is to create believable fake video. In the future, video will be generated from scratch, with no basis in actual footage," he said.

Earlier this summer, a deepfake video of US President Donald Trump stunned people attending the Copenhagen Democracy Summit 2019. During a panel event featuring Facebook’s Head of Global Affairs Nick Clegg, a 'Skype call' from POTUS, which turned out to be a deepfake video, was played to the audience.

"No one loves democracy as much as I do - that’s why God elected me", the caller said, with Boston Globe columnist Indira Lakshmanan revealing that the video was “made to show the danger” posed by AI and disinformation to elections and politics.

Scientists are working to at least protect public figures from such stunts. Earlier this month, MIT Technology Review revealed that a team of researchers in the United States developed a digital forensics technique to protect world leaders and celebrities from deepfake videos of themselves.
