Published: Tue, December 05, 2017
Entertainment | By Paul Elliott

YouTube to Expand Teams Reviewing Extremist Content

However, despite Google's claims, the bomb-making YouTube tutorial used by Abedi is available on Google's networks again, even though it was deleted in the wake of the terrorist attack, according to a report by another British newspaper, the Sun. Posters whose videos are flagged by the software may be ineligible to generate ad revenue.

YouTube has been criticised in recent weeks for allowing offensive and disturbing content to be uploaded to its platform.

Now, YouTube chief Susan Wojcicki has explained how the platform plans to keep a closer eye on the videos it hosts going forward by applying the lessons it learned fighting violent extremism content. Wojcicki says the company has begun training its algorithms to improve child safety on the platform and to be better at detecting hate speech.

In an effort to tackle the issue, YouTube has developed software to identify videos linked to extremism.

According to Wojcicki, YouTube spent the past year "testing new systems to combat emerging and evolving threats" and invested in "powerful new machine learning technology", and is now ready to apply this expertise to tackle "problematic content".


Since June, YouTube's enforcement teams have reviewed two million videos, of which 150,000 have been taken down, she said. In 2018, it will start publishing reports containing data on the flags it receives, along with the actions it takes to remove any video or comment that violates its policies.

The official added that the company was also combating aggression in comments, as well as cooperating with a number of child safety groups, such as the National Center for Missing and Exploited Children, to help fight predatory behavior.

In a separate post on the YouTube Creator Blog, Wojcicki also warned about a growing number of "bad actors" who share extremist content and disseminate videos "that masquerade as family-friendly content but are not".

This review work is helping to train the company's machine learning technology to identify similar videos, enabling staff to remove almost five times as many videos as before, she said.
