Black Creators File Class-Action Lawsuit Against YouTube

The suit alleges racial discrimination by the platform's algorithm

The lawsuit is the latest tension between YouTube and Black creators. Photo Illustration: Trent Joaquin; Source: Getty Images, YouTube

Four Black female YouTube creators filed a class-action lawsuit this week against the video platform’s parent company, Google, alleging the company and its algorithms discriminate against Black creators by removing their content without justification.

The lawsuit is the latest example of the tensions between YouTube and minority creators, and comes as most major tech companies grapple with their role in supporting racial equality.

The federal lawsuit, filed by Kimberly Carleste Newman, Lisa Cabrera, Catherine Jones and Denotra Nicole Lewis, claims the women have earned less money, accumulated fewer subscribers and have had videos deleted because of YouTube’s racial bias.

YouTube is reviewing the complaint, though its automated systems are not programmed to identify creators based on race, sex or ethnicity, a company spokesperson told Adweek, adding that YouTube goes to “extraordinary lengths” to keep the platform neutral.

Google and YouTube recently pledged support for the Black Lives Matter movement, including the creation of a $100 million fund to support Black-owned channels. In the complaint, the plaintiffs criticized the move, asking the companies to instead “spend their money to stop the racist practices that pervade the YouTube platform.”

The lawsuit accuses YouTube of 10 major discriminatory practices, including applying unnecessary age restrictions, excluding Black creators from trending tabs, and filtering content with keywords like “Black.”

YouTube’s algorithm allows the company to “rig the game,” the complaint alleges, “by using their power to restrict and block … based on racial identity or viewpoint discrimination for profit.” Google has until July 8 to respond to a court summons.

In August, a group of LGBTQ creators sued YouTube for similar issues of algorithm-driven discrimination. That case is ongoing. 

The case also questions YouTube’s liability protections under Section 230 of the Communications Decency Act, a statute that protects platforms from being held legally responsible for user-generated content on their sites. The law, commonly referred to as the backbone of the internet, also allows private internet companies to moderate and restrict offensive content on their sites, including hate speech and sexual content.

In February, YouTube handily won a case against PragerU, a conservative media company that claimed YouTube violated its First Amendment rights by flagging and removing content that it deemed inappropriate. The First Amendment only applies to government censorship, not speech through a private entity like YouTube. 

Section 230 has come under new scrutiny after Trump turned his ire toward it last month and attempted to undercut its protections after Twitter placed a fact-check label on one of his tweets.

Beyond Google, Facebook COO Sheryl Sandberg announced Thursday that the social network will invest $200 million in Black-owned businesses. LinkedIn CMO Melissa Selcher shared a similar message earlier this week, and said it would elevate Black voices on its feeds and open-source its code to allow other platforms to reduce AI-based inequality.

Rachel Winicov is an intern with Adweek for the summer of 2020 focusing on digital media, ad tech and social media. She is a rising senior at the University of Pennsylvania, where she studies classics. Rachel is from Philadelphia, Pa.
Scott Nover (@ScottNover) is a platforms reporter at Adweek, covering social media companies and their influence.