01:01 GMT, 21 September 2020

    On Tuesday, Facebook announced that it will allow people to appeal decisions to remove questionable content on the social media platform.

    "For the first time we're giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we've made a mistake," Facebook Vice President of Global Policy Management Monika Bickert wrote in a Tuesday blog post. 

    "We decided to publish these internal guidelines for two reasons. First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines — and the decisions we make — over time," the post adds.

    Facebook also describes its content policy team in the post, which consists of people who are subject matter experts on various issues including hate speech, child safety and terrorism. These experts collaborate with experts and organizations outside Facebook in order to "better understand different perspectives on safety and expression."

    "We know we need to do more," Bickert wrote. "That's why, over the coming year, we are going to build out the ability for people to appeal our decisions. As a first step, we are launching appeals for posts that were removed for nudity/sexual activity, hate speech or graphic violence," she added.

    From this point forward, if Facebook deletes a post for any of the above reasons, the person who posted the content will be notified and given a chance to ask for a second review. 

    "This will lead to a review by our team (always by a person), typically within 24 hours," Facebook said in its post. "If we've made a mistake, we will notify you, and your post, photo or video will be restored."

    "We are working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up," Bickert said.

    "We believe giving people a voice in the process is another essential component of building a fair system," she added. 

    The question of what nudity is acceptable on the site has been one of the most hotly debated. In 2016, Facebook removed a Pulitzer Prize-winning photo of a naked girl fleeing a napalm attack during a 1972 strike in Vietnam. The post, which was shared by Norwegian Prime Minister Erna Solberg, was removed on the grounds of nudity. Facebook reversed its decision after widespread backlash.

    The site has also been involved in multiple disputes over photos of breastfeeding women. In 2015, Facebook changed its rules to permit photos of women breastfeeding or women with post-mastectomy scarring.

    Facebook has been in the spotlight ever since it was revealed in March that the personal information of millions of users was shared with consultancy firm Cambridge Analytica. The company reportedly obtained the data through a personality app developed by Alexander Kogan, a Cambridge University researcher, and went on to use it to help predict and influence US voters.

    This month, Facebook founder and CEO Mark Zuckerberg testified before Congress about the data breach.

