Non-Consensual Life-Shattering ‘Deepfake’ Porn Should Be Criminalised, Study Argues
According to research conducted as part of a project on image-based sexual abuse in Australia, New Zealand and the UK, the production and sharing of fake porn images is no less harmful and destructive than physical sexual abuse, and the relevant laws should be overhauled so that such conduct is formally made punishable.

In a new report on image-based sexual abuse, researchers argue, as cited by The Telegraph, that the problem has taken a more worrying turn due to cutting-edge modern technology – specifically, “deepfake” techniques, which use machine learning to synthesise realistic human images.

    “To the untrained eye, it is very difficult to tell the difference between the fake and real images, and so the harm and harassment felt by victim-survivors is just as significant”, the report by Professor Clare McGlynn’s team, which is to be presented to British MPs on Monday, states.

According to McGlynn, in light of the serious legal and policy repercussions described in the report, “we are effectively gambling with people’s lives”. She elaborated on the team’s findings, saying they concluded that image-based sexual abuse can “shatter lives, often experienced as an entire ‘social rupture’ of their world”.

McGlynn has denounced “out-of-date and piecemeal laws”, arguing they should be reconsidered to ensure measures are taken to “criminalise the paralysing and life-threatening impact of threats, and recognising the significant harms of fake porn”. She specifically noted that all forms of non-consensual taking or sharing of sexual images, including edited ones, should be punishable, adding that although internet providers take action to remove sensitive and harmful images, the procedure is often too slow and complicated.

In late June, in a move widely seen as deeply unethical, an anonymous programmer released a new app called DeepNude that uses AI to create nonconsensual porn, turning a photo of a clothed woman into a fake nude image.

Creating fake porn and “deepfake” images is closely associated with revenge porn – the sharing of private images, not necessarily altered ones, without the victim’s consent. The latter has become a widespread offence in Britain and the rest of Europe over the past few years, but it largely falls under communications legislation, meaning victims are not automatically granted anonymity as they are in sexual offence cases, for instance.

Meanwhile, upskirting – covertly photographing underneath someone’s clothing – is a separate offence, criminalised in England and Wales in April 2019 following a high-profile social campaign and currently punishable by up to two years’ imprisonment.
