Mozilla Releases RegretsReporter Browser Extension to Analyze YouTube Recommendations Engine

It is available for Firefox and Google Chrome


Mozilla continued to take YouTube’s recommendations engine to task, this time in the form of an extension for its Firefox browser and Google Chrome.

The nonprofit released the RegretsReporter extension this week with the aim of crowdsourcing research into issues with YouTube’s recommendations engine and better understanding why those issues exist.

Mozilla vice president of advocacy and engagement Ashley Boyd said in a blog post, “YouTube recommendations can be delightful, but they can also be dangerous. The platform has a history of recommending harmful content—from pandemic conspiracies to political disinformation—to its users, even if they’ve previously viewed harmless content.”

Here’s how RegretsReporter works: As users browse YouTube, the extension will automatically send data to Mozilla about time spent on the platform, without sharing information on what those users are watching or searching.

Users can also opt to send Mozilla a report, in which they will be asked for more information on their YouTube regrets, including the video they are reporting and how they arrived at that video.

Mozilla stressed that any data collected is linked to a randomly generated user ID, not to users’ YouTube accounts, and that whatever information it discloses from the data it collects will be shared in a way that minimizes the risk of users being identified.

RegretsReporter does not collect any data if the user is in a private browsing window.

Mozilla urged users not to modify their behavior on YouTube while using RegretsReporter, such as by going out of their way to search for regrettable content, adding, “Use YouTube as you normally do. That is the only way that we can collectively understand whether YouTube’s problem with recommending regrettable content is improving, and which areas they need to do better on.”

The nonprofit said it will use the information it collects to work with journalists, policymakers, researchers and YouTube engineers to build more trustworthy systems for recommending content, and it will publicly share findings from its research.

Mozilla granted a fellowship to former YouTube engineer Guillaume Chaslot in July 2019 to investigate artificial intelligence systems, including the YouTube recommendations engine.

The nonprofit unveiled a website last October, YouTube Regrets, documenting 28 instances where the Google-owned video site’s recommendations engine led people “down bizarre and dangerous paths.”

And in July, Mozilla introduced TheirTube, a project in which it created recommendation bubbles for six different personas, based on interviews with actual YouTube users.

Boyd wrote, “Despite the serious consequences, YouTube’s recommendation algorithm is entirely mysterious to its users. What will YouTube be recommending that users in the U.S. watch in the last days before the election? Or in the following days, when the election results may not be clear?”


david.cohen@adweek.com David Cohen is editor of Adweek's Social Pro Daily.