YouTube Users and Creators Can Now Report Videos They Feel Were Unfairly Excluded From Restricted Mode

12 million additional videos are now available via the video site's Restricted Mode

YouTube on Friday announced more changes to its Restricted Mode, chief among them the availability of 12 million additional videos and a way for creators and viewers to report videos they believe have been inappropriately excluded from the feature.

The Google-owned video site described Restricted Mode as follows:

Restricted Mode is an optional setting that has been available since 2010 and is used by a small subset of users such as libraries, schools and public institutions that choose to have a more limited viewing experience on YouTube.

Last month, YouTube responded to concerns over ads being placed adjacent to controversial content and over its Restricted Mode blocking content from the LGBTQ (lesbian, gay, bisexual, transgender, questioning/queer) community and other communities.

YouTube vice president of product management Ariel Bardin addressed the issue in a blog post at the time, saying that the video site was taking steps including toughening its stance on hate speech, giving advertisers more control over where their ads appear, speeding up the appeals process for creators whose videos are demonetized, rolling out safeguards for creators and re-emphasizing its commitment to diversity.

Ad agency Havas Worldwide announced earlier in March that it was halting ad spending in the U.K. on YouTube and on Google, the video site's parent company, due to ads appearing alongside content from sources such as white nationalists and terrorists.

Fellow vice president of product management Johanna Wright announced Friday’s changes in a blog post, writing:

Back in March, our community alerted us that our systems were not working as intended—in particular, that we were unintentionally filtering content from Restricted Mode that shouldn’t have been. After a thorough investigation, we started making several improvements to Restricted Mode. On the engineering side, we fixed an issue that was incorrectly filtering videos for this feature, and now 12 million additional videos of all types—including hundreds of thousands featuring LGBTQ+ content—are available in Restricted Mode.

We also spent time over the past few weeks talking with creators and third-party organizations to better understand their experiences and questions. One thing we heard loud and clear was people’s desire to report videos they believed were being inappropriately excluded from Restricted Mode. Starting today, we’re providing a form to allow creators and viewers alike to give us feedback about this. We will use this input to help improve our automated system going forward.

Wright also clarified which topics will keep creators’ videos from being available via Restricted Mode:

  • Drugs and alcohol: If you’re talking about drug use or abuse, or if you’re drinking alcohol in your videos, your videos will likely not be available in Restricted Mode.
  • Sex: While some educational, straightforward conversations about sexual education may be included in Restricted Mode, overly detailed conversations about sex or sexual activity will likely be removed. This is one of the more difficult topics to train our systems on, and context is key. If your music video features adult themes like sex or drug use, that video will likely not make it into Restricted Mode.
  • Violence: If your video includes graphic descriptions of violence, violent acts, natural disasters and tragedies, or even violence in the news, it will likely not appear in Restricted Mode.
  • Mature subjects: Videos that cover specific details about events related to terrorism, war, crime and political conflicts that resulted in death or serious injury may not be available on Restricted Mode, even if no graphic imagery is shown.
  • Profane and mature language: Inappropriate language including profanity like “F bombs” will also likely result in your video not being available in Restricted Mode.

Image courtesy of smarques774/iStock.