YouTube Is Working With Wikipedia to Add Context to Conspiracy Videos

CEO Susan Wojcicki talks about balancing freedom with tech platforms


The past 18 months have been tough for YouTube. Between discovering Russia-linked propaganda ads and dealing with brand safety problems stemming from the platform's programmatic advertising, YouTube is trying to clean up its act and assure both consumers and brands that it is a safe space.

In its newest effort to crack down on conspiracy videos that spread misinformation, YouTube has an interesting partner: Wikipedia. Videos that propagate conspiracy theories, such as claims that the moon landing was faked, will now include a short blurb from Wikipedia providing facts that counter the theory, YouTube CEO Susan Wojcicki said during a panel at South by Southwest.

“If there’s something that’s happening in the world and there’s an important news event, we want to be delivering the right set of information, and so we felt like there was responsibility for us to do that, and for us to do that well,” Wojcicki said. “That’s what we [did] a year ago, but I think what we’ve seen is that it’s not enough—there still continues to be a lot of misinformation out there.”

Over the next couple of weeks, YouTube will address videos that focus on “well-known conspiracies” by placing “information cues”—text from Wikipedia with information about the subject—below the videos. Wojcicki did not say how many videos on the platform are flagged as conspiracies, but she used a video titled “5 Most Believed Apollo 11 Moon Conspiracies” as an example of how the Wikipedia unit will appear as a companion to a video.

The fact that Wikipedia is an openly editable site where anyone can change its content could be a problem. Wikipedia contributors are anonymous, and editors of pages often go back and forth over sourcing on divisive topics. Wojcicki did not detail how fact-checking between Wikipedia and YouTube will work.

Even as YouTube moves to vet content more aggressively, Wojcicki was clear that YouTube “is not a news organization,” adding that the company does not have fact-checkers. Instead, the onus is on YouTube to vet publishers and figure out “the authoritativeness and reputation of that publisher,” she said.

During the hourlong talk, Wired editor in chief Nicholas Thompson repeatedly pushed Wojcicki to explain where YouTube draws the line between serving as a platform for free speech and ethically addressing extremist content that is often served to users through recommendations. When Thompson asked whether YouTube would be willing to change its recommendations in ways that lead people to spend less time on the platform, and therefore watch fewer ads, Wojcicki avoided the question by saying that the platform wants to “do the right thing for the users.”

“Doing the right thing to the user is easy to say but actually hard to do. … The question is what are the metrics that we should be optimizing in the future in addition to the satisfaction of the users?” Wojcicki said.

Earlier in the talk, Wojcicki compared YouTube’s role in media and content ownership to a library, based on “the sheer amount of video that we have and the ability for people to learn and look up any kind of information.”

Libraries have also traditionally been controversial places that celebrate banned books, she said. “There has always been in the history of information some set of information that people think other people shouldn’t have.”

The bottom line: It’s tricky to balance freedoms with a platform that has over one billion users who watch one billion hours of content a day. While YouTube believes in certain freedoms—expression, information and opportunity, for example—“this year has shown that sometimes, those freedoms are in conflict with each other,” Wojcicki said.