DeepFake App That ‘Undresses’ Women Closed Down Hours After Going Viral

A collage of screenshots of the deepfake app DeepNude (© Photo: Mike Sington/Twitter/screenshot)
The deepfake technology behind the app has recently gained notoriety for creating fake videos (often explicit) in which the original actors' faces are replaced with somebody else's. The newly developed software took this technology a step further, turning photos of women into nudes within seconds.

An application that drew both instant praise and loathing on Thursday for its ability to ‘undress’ a person in a photo has been taken down, with its creators saying they cannot afford to have people misuse their brainchild.

The app, called DeepNude, had taken the art of Photoshopping to a whole different level by using a machine-learning algorithm to remove the clothes from a person in an image.

It appeared to work best with images of women, since the underlying neural network had been trained on a huge dataset of nude photos of women.

The free version created fake nudes with a large watermark across the photo, but resolute voyeurs could pay $50 to upgrade to the premium version that created uncensored images.

“We created this project for user’s [sic] entertainment a few months ago. We thought we were selling a few sales every month in a controlled manner. Honestly the app is not that great, it only works with particular photos,” the creators said in a statement.

“We never though [sic] it would become viral and we would not be able to control the traffic. We greatly underestimated the request.”

A flurry of media reports about DeepNude saw its popularity skyrocket, but the developers appeared unprepared for the sudden surge in traffic to their website, with the app's Twitter account reporting a number of crashes on Thursday.

But it was not the technical problems that prompted them to give up on their idea.

They said: “Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. We don’t want to make money this way.”

“Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it. Downloading the software from other sources or sharing it by any other means would be against the terms of our website.”

“The world is not yet ready for DeepNude.”
