YouTube Sets New Policies to Curb Extremist Videos

By Christine Zosche 

YouTube has struggled for years with videos that promote offensive viewpoints but do not necessarily violate the company’s guidelines for removal. Now it is taking a new approach: Bury them. (NYT)

Google will increase its use of technology to identify extremist and terrorism-related videos across its sites, which include YouTube, and will boost the number of people who screen for such content, Google's general counsel, Kent Walker, wrote in an editorial in the Financial Times on Sunday. The company will also be more aggressive in putting warnings on, and limiting the reach of, content that, while not officially forbidden, is still inflammatory. (Bloomberg)

Currently, YouTube uses a combination of video analysis software and human content flaggers to find and delete videos that break its community guidelines. The first step, Walker wrote, is to devote more resources "to apply our most advanced machine learning research" to that software, meaning artificial intelligence that can learn over time which content violates the guidelines. The second step is to increase the number of "independent experts in YouTube's Trusted Flagger Program," which is composed of users who report inappropriate content directly to the company. (WaPo)
