17:10 GMT +3, 06 December 2019
    Flag of the Islamic State in the conflict zone

    YouTube Using Machine Learning to Speed Removal of Extremist Content

    © Sputnik / Andrey Stenin

    YouTube is using a combination of new technology and additional human reviewers to combat terrorist content on its site.

    The Google-owned video site ramped up its efforts approximately two months after UK Prime Minister Theresa May called for firms to take a proactive stance on removing online extremist propaganda, after a terrorist attack on London Bridge killed eight people and injured 48.

    According to CNET, YouTube is using machine learning technology to identify and remove more than 75 percent of extremist content before users even have a chance to flag it as inappropriate. The improved accuracy of the technology has more than doubled the number of videos YouTube takes down from its site, despite the large amount of content being published every minute.

    The video site has expanded its counter-terrorism efforts by automatically playing videos that expose extremist recruiting myths when users search specific keywords on the site.

    YouTube is also introducing a feature in the coming weeks that places videos with "inflammatory religious or supremacist content" in a "limited state" behind a warning. These videos will not be eligible for comments, likes, monetization or recommendations.

    The video site is also increasing the number of independent experts working in its Trusted Flagger program, which is a network of groups and individuals responsible for reporting videos that may violate the company's guidelines. YouTube is currently working with more than 15 new organizations, such as the No Hate Speech Movement, the Institute for Strategic Dialogue and the Anti-Defamation League.

    Kent Walker, a senior vice president at Google, said in a blog post, "Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all. Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist content on our services."

    Related:

    US Military Hackers Join Online Anti-Terror War, Drop 'Cyberbombs' on Daesh
    Google Reviews Advertising Policies Amid Extremist YouTube Videos Scandal
    Google Exec Proposes Algorithms to Weed Out Online Terrorism, Hate Speech
    Russian Jailed for ‘Justifying Terrorism’ Online – Prosecutors
    Expert on Cyber Terrorism Reveals to Sputnik How to Protect Your Online Data
    Tags:
    online media, terrorism, recruitment, Google, YouTube