01:51 GMT+3, 22 October 2019

    Money for Nothing, Nudes for Free: Reddit User Allegedly Cracks DeepFake 'Women Undressing' App

    CC BY 2.0 / Caitlin Regan / Love Them.

    The author of the crack claimed that DeepNude does nothing novel, as it merely packages existing image manipulation libraries and a picture dataset into an interface. And while the programme was taken down by its creators, copies of it can still be found online.

    As the creators of the DeepNude app, designed to “undress” people featured in photos, announced that they were taking their creation down, one inquisitive social media user revealed that he had managed to “crack” the programme, effectively giving people free access to its premium version, which cost $50 while the app was on sale.

    While the free version of the programme created fake nudes with a large watermark across the photo, thus requiring those who wanted fully uncensored images to pay for a premium upgrade, the new crack allegedly allows one to enjoy all of the programme's functions free of charge.

    The redditor noted that he spent about 4 hours tinkering with the program, expecting it to be “full of malware”, but it turned out to be “mostly honest”.

    “It does send up a uuid from wmic, which he likely stores in a DB somewhere, but those aren't very identifiable really. (backend seems to be AWS Lambda). It also stores your registered email address inside the metadata of the ‘saved’ image, presumably to track who is sharing the output pictures. That tracking will only affect you if you actually paid for a premium code, otherwise the value is an empty string”, he remarked.
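The kind of tracking the redditor describes is easy to check for yourself: image formats such as PNG and JPEG can carry arbitrary text metadata alongside the pixels. A minimal sketch of such a check, assuming Python with the Pillow imaging library (the article does not document which metadata field DeepNude actually uses, so this simply dumps everything it finds):

```python
# Hedged sketch: inspect an image file for embedded text metadata,
# the mechanism by which an app could hide a registered email
# address inside a "saved" picture. Requires Pillow (pip install Pillow).
from PIL import Image


def dump_image_metadata(path):
    """Return (and print) any text metadata embedded in an image file."""
    img = Image.open(path)
    # PNG text chunks (tEXt/iTXt) surface in img.info;
    # JPEG/TIFF EXIF entries are exposed via getexif().
    metadata = dict(img.info)
    for tag_id, value in img.getexif().items():
        metadata[f"exif:{tag_id}"] = value
    for key, value in metadata.items():
        print(f"{key}: {value}")
    return metadata
```

Running this on an output image would reveal any embedded identifier; per the redditor's account, unpaid copies would show only an empty string where a premium user's email would sit.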

    The redditor also described DeepNude as an underwhelming piece of software, arguing that while the programme deserved all the criticism it got, it does nothing novel, as it “only packages existing python ML/image manipulation libraries + a picture dataset into an interface”.

    Shortly after the netizen created a Reddit post detailing his exploits, the entire r/DeepNudes subreddit was banned from the platform for “a violation of Reddit’s content policy against involuntary pornography”.

    The app quickly went viral upon its release on 27 June but was taken down by its creators mere hours after launch, with the developers claiming that the probability of people misusing their creation was simply too high to ignore.

    Related:

    'Deepfakes': Porn Created by Artificial Intelligence Targets Hollywood Stars
    Facebook Tests Own Policies With DEEPFAKE Video of Zuckerberg
    Tags:
    Crack, app, nude photos, photo, Deepfakes