- Increasing use of technology to help identify extremist and terrorism-related videos.
- “Greatly” increasing the number of independent experts in YouTube’s Trusted Flagger program.
- Taking a tougher stance on videos that may not clearly violate YouTube’s policies.
- Expanding YouTube’s role in counter-radicalization efforts.
YouTube updated its efforts on these fronts in a blog post, saying that it provided “large volumes of training examples” to its review teams, and that more than 83 percent of the videos it removed for violent extremism in the past month were pulled before they received a single flag, up 8 percentage points versus August.
The video site’s Trusted Flagger program of outside experts who advise on policy and help flag content now includes 35 non-governmental organizations representing 20 countries.
YouTube also said it is researching how to expand Jigsaw’s Redirect Method—under which users who search for certain keywords on the site are shown playlists of videos that debunk violent extremist recruiting narratives—to new languages and search terms.
YouTube said it is also “heavily” investing in its Creators for Change program, adding chapters in Israel and Spain.
In its previous update in August, the Google-owned video site said that it:
- Began developing and implementing “cutting-edge machine learning technology” to help identify and remove violent extremism and terrorism-related content.
- Started working with “more than 15” additional NGOs and institutions via its Trusted Flagger program.
- Began placing videos that don’t violate its policies but “contain controversial religious or supremacist content” in a “limited state,” meaning they remain on YouTube behind an interstitial and are not recommended or monetized, with comments, suggested videos and likes disabled.
- Began rolling out features from Jigsaw’s Redirect Method.