YouTube Hate Speech Ban Gives Company Power to Decide What Can & Cannot Be Seen – Scholars

Only last week, YouTube announced it would delete material that glorifies Nazis from its platform in an attempt to stop people from becoming radicalised, but the move is raising questions about how the company's algorithms choose which videos to remove.

While the move has generally been welcomed because it bans supremacist content, the website has also received complaints from teachers in the UK whose educational videos on Nazism and World War II, including historic archival footage, have been removed from the platform.

Richard Jones-Nerzic, a history teacher, had his channel censored after uploading old documentaries about the rise of Nazism, and is concerned about the "indiscriminate nature of the actions" of YouTube.

"There was no way that a human reviewer could watch my videos and claim they were inciting hatred etc. The two films that were removed from my channel were extracts from BBC schools documentaries, designed for students aged 13-16. YouTube's actions will obviously impact the teaching and learning community", said Jones-Nerzic.

The teacher also explained that these clips are important teaching resources: they are used in class, and students need to be able to access them for their studies.

YouTube said in its defence that users should make the context of a video clear, but as Jones-Nerzic pointed out, in his case this was "ridiculous" because his site is called "international school history".

YouTube said on Wednesday that it was taking another step in tightening its hate speech policy by "prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status. This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory".

Kevin Curran, professor of cyber security at the Department of Computing, Engineering & Built Environment at Ulster University, said that the Google-owned platform has a very difficult task in determining which videos may be considered extreme and which merely historical:

"It's a very difficult problem, you’re talking about one of the richest companies in the world and they need to get it right, and they’re struggling. It’s very, very difficult. What can be done to help it is context, and anyone that uploads a video of maybe Nazis or Hitler should provide as much context about that, and of course these things can improve over time".

YouTube has been criticised for hosting supremacist content and for recommendation algorithms that encourage users to stay on the site and click on the next "recommended video", which is usually more extreme than the previous one, something Curran has also noted:

"One thing we have discovered is that the recommendation for the next video to watch, and the next video, and the next video, always tends to border radicalisation to the extreme. So if you watch enough videos and click the next recommended you will end up on extremist websites 9 times out of 10".

The video-sharing platform will have to use machine learning algorithms to go through all the uploaded content, and it will learn from its mistakes along the way, the professor believes.

When videos are deemed extremist, that's "where the human content moderators can come in and look at that video and say, 'Okay, that's educational, that's historical, that does not need to be removed', so, therefore, yes they will improve over time, we're still in the early days", Curran said.
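To make the division of labour Curran describes more concrete, here is a minimal, purely illustrative sketch in Python of a human-in-the-loop moderation pipeline. Everything in it, the scoring function, the thresholds and the labels, is a hypothetical placeholder standing in for a trained classifier; it is not YouTube's actual system.

    # Illustrative sketch only: a human-in-the-loop moderation pipeline of the
    # kind Curran describes. Scoring function, thresholds and labels are
    # hypothetical placeholders, not YouTube's actual system.
    from dataclasses import dataclass
    from enum import Enum


    class Decision(Enum):
        KEEP = "keep"                  # clearly benign, stays up
        HUMAN_REVIEW = "human_review"  # borderline: routed to a moderator
        REMOVE = "remove"              # high-confidence policy violation


    @dataclass
    class Video:
        video_id: str
        title: str
        description: str


    def hate_speech_score(video: Video) -> float:
        """Placeholder for a trained classifier; returns a score in [0, 1].

        A real system would run a machine-learning model over the video's
        audio, frames and metadata. Here a toy keyword check stands in so
        the example runs end to end.
        """
        flagged_terms = {"supremacist propaganda", "glorify"}
        text = f"{video.title} {video.description}".lower()
        hits = sum(term in text for term in flagged_terms)
        return min(1.0, 0.45 * hits)


    def moderate(video: Video, remove_at: float = 0.9,
                 review_at: float = 0.4) -> Decision:
        """Act automatically only on confident scores; send the uncertain
        middle band to human reviewers, who can recognise educational or
        historical context the model misses (the step Curran points to)."""
        score = hate_speech_score(video)
        if score >= remove_at:
            return Decision.REMOVE
        if score >= review_at:
            return Decision.HUMAN_REVIEW
        return Decision.KEEP


    if __name__ == "__main__":
        clip = Video("abc123", "BBC schools documentary: the rise of Nazism",
                     "Archival footage for history students aged 13-16")
        # Prints Decision.KEEP: no flagged terms, so no automatic removal.
        print(moderate(clip))

The width of that middle band is the key design choice: widen it and more clips like Jones-Nerzic's documentaries reach a human reviewer instead of being removed automatically, at the cost of greater reviewer workload.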

Another major worry is that legitimate historical and educational content is already being moderated incorrectly, according to the cyber security professor:

"We don't want censorship which is affecting what has happened, what we should be discussing as a society, we don't want certain items to be erased from our history, but we do want anything which is really causing the hate. [...] 99 % of the population do not want that, so we want that to be erased", Curran stressed.

YouTube's CEO, Susan Wojcicki, speaking at a technology conference in Arizona this week, said that the company has made around 30 changes in just the last 12 months, the latest being its policy on hate speech.

But when talking about radicalisation, Wojcicki stated that the site still offers "a diversity of opinions" and that it changed how it handles recommendations at the start of the year. YouTube identifies "borderline content", as determined by raters from across the US, and has now reduced recommendations of such content by 50%.

"I think the combination of changes that we are making of our policies as well as the changes that we are making to our recommendations are going to make a really big difference", Wojcicki described the company's efforts.

While Jones-Nerzic says he has enormous sympathy for YouTube's attempts to clean up its platform, he is "also concerned by the power this gives to an unaccountable, enormously powerful corporation to decide what can and cannot be seen", the teacher concluded.

Views and opinions expressed in the article are those of the speakers and do not necessarily reflect those of Sputnik.
