06:23 GMT, 30 November 2020

    The actual number of women who were targeted by the deepfake bot may exceed 104,000, as the company that uncovered its activities was only able to count images that were shared publicly.

    Numerous images of women ended up being transformed into deepfake nudes via a bot that operates on the popular messaging app Telegram, Wired UK reports, citing new research by "deepfake detection company" Sensity.

    According to the media outlet, at least 104,000 women have been targeted by the bot since July, with the AI "removing" clothes from non-nude photos to produce naked deepfake pictures.

    The bot reportedly sends galleries featuring such doctored images on a daily basis to an associated Telegram channel that has nearly 25,000 subscribers, with such picture sets often being viewed over 3,000 times; there is also a separate Telegram channel promoting the bot that has over 50,000 subscribers.

    "It is maybe the first time that we are seeing these at a massive scale," said Giorgio Patrini, Sensity's CEO and chief scientist.

    The media outlet also points out that the actual number of women targeted by the bot may be higher than 104,000, as the program offers the option to "generate photos privately", while Sensity was only able to count the images that were shared publicly.

    "Most of the interest for the attack is on private individuals," Patrini remarked. "The very large majority of those are people that we cannot even recognise."

    Earlier this year, Sensity also reportedly warned that numerous "explicit deepfake videos" featuring "female celebrities, actresses, and musicians" were being posted to porn websites on a seemingly regular basis, with up to 1,000 deepfake videos uploaded to such sites every month in 2020 alone.

    Tags:
    artificial intelligence, nude photos, Deepfakes, Telegram