
Mark Zuckerberg Warns Facebook's Ability to Moderate Content Has Been Impacted by Covid-19

Covid-19 has led Facebook to limit its use of human moderators, with the company directing them to focus on initial reviews of the most severe content violations reported by its users. As a result, Facebook has relied less on human moderators to look at appeals involving other types of content.

Facebook CEO Mark Zuckerberg said on Tuesday that the Covid-19 pandemic has reduced the company’s ability to moderate content on its social networks due to its limited use of human moderators, according to CNBC.

When lockdown measures started in mid-March, Facebook and its partners sent content moderators home to keep them safe, Zuckerberg said.

Content moderators are responsible for removing pornography, terrorist content, hate speech and other unwanted material from across the site. Facebook employs 15,000 of these contractors at 20 sites globally, according to the Washington Post.

“Our effectiveness has certainly been impacted by having less human review during Covid-19, and we do unfortunately expect to make more mistakes until we're able to ramp everything back up,” said Zuckerberg.

As a result of this limitation, Facebook decided to prioritize the use of human moderators for initial reviews of the most severe content violations reported by its users.

Zuckerberg said he expects the amount of appealed content to be much lower in the company’s August report.

Already, that drop in content appeal reviews can be seen in the Tuesday report. Content appeal reviews for January through March came in at 2.3 million pieces of content, down nearly 18% from content appeal reviews between October and December 2019 and down nearly 26% from January through March 2019.

According to the Washington Post, while Facebook, YouTube, Twitter and other companies have long touted artificial intelligence and algorithms as the future of policing problematic content, they’ve more recently acknowledged that humans are the most important line of defense.

Facebook is now in the process of bringing its content moderator contractors back online to help with the review of content, and the majority of those reviewers can now work from home, said Guy Rosen, Facebook vice president of integrity, as reported by CNBC.

“There’s obviously differences in what that work is like, so we’re working hard to make sure that we’re prioritizing things the right way,” Rosen said.
