Susan Wojcicki, CEO of YouTube, said the company will employ 10,000 moderators "across Google" to tackle content that violates its guidelines.
According to Wojcicki, YouTube spent last year "testing new systems to combat emerging and evolving threats" and invested in "powerful new machine learning technology," and is now ready to employ this expertise to tackle "problematic content."
"Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube," she said in a statement posted on YouTube’s official blog.
As the YouTube CEO explained, the company will continue to bolster the staff tasked with monitoring video content and will expand "the network of academics, industry groups and subject matter experts" who can help YouTube "better understand emerging issues."

"As challenges to our platform evolve and change, our enforcement methods must and will evolve to respond to them. But no matter what challenges emerge, our commitment to combat them will be sustained and unwavering," she declared.
In a separate post on the YouTube Creator Blog, Wojcicki also warned about a growing number of "bad actors" who share extremist content and disseminate videos "that masquerade as family-friendly content but are not."
To combat this issue, the video-hosting platform intends to "apply stricter criteria and conduct more manual curation" while simultaneously expanding its team of human reviewers "to ensure ads are only running where they should."