YouTube is tweaking its recommendation system to reduce exposure to what it calls “borderline content and content that could misinform users in harmful ways.”
Over 200 million videos are recommended on YouTube’s homepage every day, the company said in a blog post.
The Google-owned video site offered examples of content that could be affected: videos that promote miracle cures, claim the Earth is flat or make “blatantly false claims” about historic events such as the Sept. 11 terrorist attacks.
Fewer than 1 percent of videos on its platform will be affected, YouTube said, adding, “We believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community … We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.”
YouTube said the change affects only whether a video is recommended, not whether it is available for viewing on the platform, adding that all videos that comply with its community guidelines can still be accessed.
The change will be rolled out gradually, starting with a small set of videos in the U.S., and YouTube said that as its systems—which rely on machine learning and real people—become more accurate, it will expand the change to more countries.
The company wrote, “You might remember that a few years ago, viewers were getting frustrated with clickbaity videos with misleading titles and descriptions (‘You won’t believe what happens next!’). We responded by updating our system to focus on viewer satisfaction instead of views, including measuring likes, dislikes, surveys and time well spent, all while recommending clickbait videos less often. More recently, people told us they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles.”