Mozilla Shares 28 User Stories of YouTube’s Recommendations Engine Leading Them Astray

The Google-owned video site began cleaning up its recommendations in January

Nonprofit Mozilla unveiled the YouTube Regrets website Tuesday to share 28 instances where the Google-owned video site’s recommendations engine led people “down bizarre and dangerous paths,” as vice president of advocacy Ashley Boyd put it in an email.

Mozilla said it asked people for their #YouTubeRegrets and received hundreds of responses with tales of content about racism, conspiracies and violence showing up in their recommendations after viewing “innocuous” content.

Examples include: A person who watched a video about Vikings saw white supremacist content in their recommendations; someone who watched confidence-building videos that were posted by a drag queen saw clips of homophobic rants; and a YouTuber who searched for fail videos saw grisly footage from fatal accidents in their recommendations.

YouTube spokesperson Farshad Shadloo said in an email, “While we welcome more research on this front, we have not seen the videos, screenshots or data in question and can’t properly review Mozilla’s claims.”
