YouTube Continues to Push 'White Supremacist' Videos, Study Claims

© AP Photo / Danny Moloshok: This Oct. 21, 2015, file photo shows signage inside the YouTube Space LA offices in Los Angeles.
YouTube has previously been criticized for allowing extremist videos, including 'white supremacist' content, to remain on its platform. In the past, the platform has attempted to remove such videos.

A new report by the Anti-Defamation League (ADL), an organization that works to stop the defamation of the Jewish people and to secure fair treatment for all, has found that even though YouTube has removed thousands of extremist videos, "white supremacist" content still circulates among subsets of users.

According to the study, which evaluated the viewing habits of 915 people, one in every 10 participants viewed at least one video from an extremist channel, and two in 10 viewed a video from an "alternative" channel. Alternative channels are ones that "serve as gateways to more extreme forms of content," the ADL reports.

The study also found that YouTube’s recommendation algorithm showed extremist content to those who had viewed similar content previously.

“Participants often received and sometimes followed recommendations for videos from alternative and extremist channels, especially on videos from those channels,” the study found. 

For example, users who viewed extremist content on YouTube were recommended other similar videos to watch almost 30% of the time. In addition, almost all of the 9% of users who viewed a video from an extremist channel also viewed videos from alternative channels.

The report also identified 322 "alternative" channels and 290 "extremist or white supremacist" channels on YouTube. Of the two groups combined, 515 channels were still active as of the end of January.

“Despite the recent changes that YouTube has made, our findings indicate that far too many people are still being exposed to extremist ideas on the platform,” Brendan Nyhan, an author on the report and professor of government at Dartmouth College, said in a statement.

In a separate statement, YouTube defended how it has dealt with extremist content.

“We have clear policies that prohibit hate speech and harassment on YouTube, and terminated over 235,000 channels in the last quarter for violating those policies,” the company said in a statement, the Hill reported.

“Beyond removing content, since 2019 we’ve also limited the reach of content that does not violate our policies but brushes up against the line, by making sure our systems are not widely recommending it to those not seeking it. We welcome more research on this front, but views this type of content gets from recommendations has dropped by over 70% in the US, and as other researchers have noted, our systems often point to authoritative content,” the company added.

ADL CEO Jonathan Greenblatt noted, however, that YouTube has not done enough to prevent the dissemination of extremist content.

“It is far too easy for individuals interested in extremist content to find what they are looking for on YouTube over and over again,” Greenblatt said in a statement obtained by the Hill. “Tech platforms including YouTube must take further action to ensure that extremist content is scrubbed from their platforms, and if they do not, then they should be held accountable when their systems, built to engage users, actually amplify dangerous content that leads to violence,” he added.