YouTube is bringing its A game to tackle terrorism online

YouTube has often been criticized for not doing enough to curb offensive content (mainly terrorist propaganda videos) that makes its way onto the platform and spreads extremism online.

Now Alphabet, the parent company of Google and YouTube, has released a new set of rules it will follow to tackle this problem head-on.

Here’s all you need to know.

Content: A dilemma of sorts

When it comes to content, things get challenging for hosting platforms: footage of a terrorist attack shown as news by a broadcaster can be re-uploaded by another user as a video propagating violence.

While the content remains the same, the context differs, making it hard to decide what should be allowed and what should be taken down.

Solution: Measures to tackle the issue

YouTube has been using its video analysis models to detect videos being used in the wrong context, and over the last six months it has removed 50% of such content this way.

From now on, it will up its game by applying machine learning research to train new "content classifiers" capable of sniffing out extremist and terrorism-related content more quickly.

Flagger: Getting more eyes on the platform

YouTube will also increase the number of independent experts in its Trusted Flagger program to more precisely distinguish between propaganda and newsworthy speech.

Trusted Flagger reports are accurate about 90% of the time, which is why the program is now being scaled up.

Operational grants will be provided to 50 expert NGOs, which will join the 63 organizations already engaged in this task.

Extremists: No monetary benefits on controversial videos

YouTube will also work more closely with counter-extremist organizations to identify content that terrorist organizations use to recruit extremists.

Videos that don't violate policies but contain inflammatory religious or supremacist content will now appear behind an interstitial warning; they will no longer be monetized, recommended, or open to comments and user endorsements.

The aim is that, with less engagement, these videos will fade into obscurity.

Joining hands: A unified stand against terrorism

YouTube will also expand its counter-radicalization measures by nurturing its 'Creators for Change' program, which supports YouTube creators who speak out against hate and radicalization.

Alphabet has also committed to working with industry peers like Facebook, Microsoft, and Twitter to create an international forum that will share and develop technology, support smaller firms, and jointly tackle terrorism online.

Source: NewsBytes