Not-So-Deep Fakes: New AI-Powered App Creates Realistic Nude Photos of Women for $50

© Photo: Mike Sington/Twitter. A screenshot of the deepfake app DeepNude made by Vice contributors.
In 2019, hackers who might leak your closely guarded private photos are no longer the only threat: a newly developed artificial neural network needs only an ordinary picture of you to replace your clothes with a fabricated naked body.

An anonymous ‘technology enthusiast’ has created an app that can “undress” a fully clothed person in a couple of clicks, triggering concerns over the ethics of such technology and non-consensual photo sharing.

The app in question is called DeepNude, a play on “deepfake”, the term for AI-assisted technology that superimposes one person’s face onto another’s body in photos and videos, potentially allowing users to create revenge porn, for example.

The website where one can download the app was launched in late March, according to the Twitter account of the enigmatic creator. The app is available in two versions, premium and free.

According to a report by Vice, one does not have to be tech-savvy to use this sexually oriented, automated version of Photoshop. Curious users only have to upload the photo they want to “undress”, click a button and wait some 30 seconds for the AI to process it.

DeepNude appears to work only with high-resolution photos of women facing the camera directly, swapping their clothes for a nude body, regardless of what they are wearing. Tests with images of men, cartoon characters, or people photographed from unnatural angles, in low resolution, or in poor lighting are said to produce varying results, from somewhat comical to outright disgusting.

Furthermore, according to screenshots taken by Vice contributors, the free version produces an image covered by a huge dartboard-like watermark. Upgrading to premium for $50 removes the watermark, leaving only a clearly visible “FAKE” inscription on the otherwise uncensored image.

The app currently runs on Windows 10 and Linux, while a Mac version is in the works.

The creator, who requested to be identified as Alberto, told Vice that his brainchild uses a conditional generative adversarial network (cGAN), an algorithm trained to generate naked images after learning from a dataset of more than 10,000 nude photos of women.
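Vice’s report names only the family of algorithm, a conditional GAN; the best-known example of that family is pix2pix-style paired image-to-image translation. As a rough illustration of how such a model is trained in general, not of DeepNude’s actual, unpublished code, a minimal sketch might look like the following. All layer sizes, the loss weight, and the training-loop details here are assumptions for the sake of the example.

```python
# Minimal, illustrative sketch of a conditional GAN (cGAN) training step for
# generic paired image-to-image translation, in the spirit of pix2pix.
# Everything below (architecture, hyperparameters) is assumed for illustration;
# it is NOT DeepNude's actual implementation, which has not been published.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps an input image to an output image (simplified encoder-decoder)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Judges (input, output) pairs, so realism is conditioned on the input."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(128, 1, 4, stride=1, padding=1),  # PatchGAN-style score map
        )
    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=1))  # concatenate along channels

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()

def train_step(x, y):
    """One cGAN update on a paired batch: x = input image, y = target image,
    both scaled to [-1, 1] to match the generator's Tanh output."""
    fake = G(x)
    # Discriminator: push real pairs toward 1, generated pairs toward 0.
    d_real, d_fake = D(x, y), D(x, fake.detach())
    loss_d = (bce(d_real, torch.ones_like(d_real))
              + bce(d_fake, torch.zeros_like(d_fake)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: fool the discriminator, plus an L1 term pulling toward the target.
    d_fake = D(x, fake)
    loss_g = bce(d_fake, torch.ones_like(d_fake)) + 100.0 * l1(fake, y)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

The “conditional” part is that the discriminator sees the input image alongside the candidate output, so the generator is rewarded for producing outputs that are plausible given that specific input, rather than merely realistic in general.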

"I'm not a voyeur, I'm a technology enthusiast,” he was quoted as saying. “Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That's why I created DeepNude."

"This is absolutely terrifying," said Katelyn Bowden, founder and CEO of non-profit Badass, which fights revenge porn and image abuse. "Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public."

Danielle Citron, professor of law at the University of Maryland Francis King Carey School of Law, called the technology an “invasion of sexual privacy.”

But the creator insisted that DeepNude makes little difference and is no more harmful than Photoshop, which simply takes longer to achieve the same results.

Shortly after the initial report in Vice, DeepNude’s Twitter account said the still-unstable website was down due to an unexpected increase in traffic. It remained unavailable as of the time of writing.
