
Facebook Moderators Rely on Inaccurate, Outdated Docs to Check Content - Reports

Facebook’s thousands of content moderators reportedly rely on PowerPoint slides to determine what content to block or allow, with some slides containing outdated or inaccurate information.

According to documents obtained by The New York Times, some 1,400 pages of guidelines are used by more than 7,500 moderators as they approve or reject content. However, not all of these pages are accurate or up to date.


Moderators were led to remove fund-raising appeals for volcano victims in Indonesia because a co-sponsor of the drive was on Facebook’s internal list of banned groups. In Myanmar, a paperwork error allowed an extremist group accused of fomenting genocide to stay on the platform for months. In India, moderators were mistakenly told to take down comments critical of religion.

According to the Times, the company confirmed the authenticity of the documents but claimed that the rulebook is used only for training. Employees speaking to the paper on condition of anonymity, however, said the rules are applied in their daily work.

The rulebook tries to cover every possible scenario: one document sets out several rules just to determine when a word like “martyr” or “jihad” indicates speech in support of terrorism. Another describes when discussion of a barred group should be forbidden. Yet the rules can also be adjusted to fit current political agendas: in June, according to internal emails reviewed by The New York Times, moderators were told to allow users to praise the Taliban if they mentioned its decision to enter into a ceasefire.

Another problem is that the slides guiding Facebook moderators are written in English, with moderators often relying on Google Translate and little account taken of local linguistic or cultural traditions. One moderator said the rule is to approve any post written in a language that no one available can read.

Monika Bickert, Facebook’s head of global policy management, said that the primary goal was to prevent harm and that, to a great extent, the company had been successful.

“We have billions of posts every day, we’re identifying more and more potential violations using our technical systems,” Bickert said. “At that scale, even if you’re 99 percent accurate, you’re going to have a lot of mistakes.”

Facebook has repeatedly been accused throughout the year of a lack of transparency in its handling of users’ data, including allowing third parties to access the personal data of tens of millions of people without their consent.
