Facebook Set to Combat ‘Revenge Porn’ Through New AI Technology

© REUTERS / Regis Duvignau/Illustration: A woman looks at the Facebook logo on an iPad in this photo illustration taken June 3, 2018
For those who don’t know, revenge porn is typically understood as the distribution of sexually indecent images or videos of an individual without their consent or knowledge, and it has become an increasingly serious problem for social media users over the past few years.

Facebook is set to introduce a new Artificial Intelligence (AI) detection tool that will remove so-called ‘revenge porn' before it has even been reported, the social media giant announced over the weekend.

The purpose of the new function is to save victims of the unwanted posts the time and hassle of having to report them and of subsequently going through the sometimes time-consuming process of getting them removed.

The announcement marks Facebook's latest effort to purge the platform of content it deems malicious and abusive.

In a recent blog post, the corporation's Global Head of Safety, Antigone Davis, said that "finding these images goes beyond detecting nudity on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram."

Yet, rather concerningly to some, Davis also wrote that there will be a new "emergency option to provide a photo proactively to Facebook, so it never gets shared on our platforms in the first place." In other words, users will be able to send their photos to Facebook so that the company can feed them into the AI system, which will then be able to recognise if that photo is uploaded onto the platform.

 

The AI tool will be trained using posts and photos that Facebook has previously removed as revenge porn.

Reportedly, the AI tool will be designed to home in on "nearly nude" images, such as photos of someone in their underwear, accompanied by text that appears intended to shame or humiliate the person pictured, suggesting that whoever uploaded the content did so out of revenge or to cause embarrassment. Flagged photos will then undergo a second analysis by a human reviewer, who will confirm whether or not the post qualifies for deletion. Moreover, Facebook has said that in the majority of such cases, the account that uploaded the content will be disabled.

READ MORE: Facebook Says Removed 1.5 Million Videos of NZ Mosque Shooting in First 24 Hours

On top of the AI function, Facebook has also said that it will launch a new support centre for victims of revenge porn, called ‘Not Without My Consent,' where they will be able to seek advice on how to cope.

Facebook has pledged to increase its moderation efforts after receiving a barrage of criticism over the past few years for not acting quickly to remove offensive posts, which some users say have been allowed to linger for too long. 

Despite these advances in efforts to catch perpetrators of revenge porn and similar abuses, Facebook has said that the technology will not be able to capture absolutely every instance, and that it will still need users to step up and report indecent images when they see them.

Over the past few years, some 42 US states have outlawed revenge porn as the number of posts containing non-consensual images and videos has spiked. In February 2019, for example, New York passed a law allowing victims of revenge porn to take perpetrators to court.
