YouTube Reportedly Considering Sweeping Changes as FTC Investigation Looms

Platform could move children's content to standalone app

The move could impact the billions in ad revenue that pour into the platform every year.

YouTube is reportedly considering major changes to how it recommends and classifies children’s content amid a flurry of concerns about the platform’s safety, The Wall Street Journal reported Wednesday.

The possible changes, according to the Journal, could include moving all children’s content to the standalone YouTube Kids app, or turning off the automatic recommendation feature that algorithmically queues up a new video once the current one finishes playing.

Both moves stand to change how users, especially young ones, navigate the platform, with knock-on effects for the advertisers that pour billions of dollars into YouTube every year to reach those viewers.

The possible changes come as YouTube faces continued criticism over safety on the platform and braces for additional regulatory scrutiny. The Federal Trade Commission is in the late stages of an investigation into how the company has handled children’s videos, according to The Washington Post.

A spokesperson for Google, YouTube’s parent company, said that not every idea the platform considers for changing YouTube ends up being implemented.

“We consider lots of ideas for improving YouTube and some remain just that — ideas,” the spokesperson said. “Others, we develop and launch, like our restrictions to minors live streaming or updated hate speech policy.”

A spokesperson for the FTC declined to comment.

YouTube, which boasts 2 billion visitors every month, has repeatedly come under fire for its handling of children’s content. After a video creator demonstrated how YouTube was “facilitating pedophiles’ ability” to share links to child pornography or to videos of children that could be sexualized, the platform suspended comments on videos featuring children and scrambled to contain the backlash among advertisers.

More recently, The New York Times reported on a study that suggested YouTube’s recommendations directed pedophiles toward videos of younger and younger children. The Times has also reported on how YouTube recommendations radicalized a young man by steering him toward extremist content.

Advocacy groups including the Center for Digital Democracy and Common Sense Media have called on the FTC to investigate YouTube for possibly violating the Children’s Online Privacy Protection Act, a federal law that limits the collection of personal information from children under the age of 13.

YouTube created YouTube Kids in 2015 to address some concerns about child safety on YouTube proper. Nonetheless, the app has faced criticism for the kinds of content that have appeared there, as well as for carrying advertising aimed at children.

The platform’s handling of children’s content is not the only issue YouTube has faced in recent months. Earlier this month, Vox Media video journalist Carlos Maza called on YouTube to enforce its policies after facing a barrage of abuse on the platform; YouTube first said the abuse did not violate its policies before making an about-face, a reversal that left few of the parties involved satisfied. The platform has also repeatedly faced advertiser backlash for running ads before objectionable content.

For the most part, advertisers say they’re hard-pressed to leave the platform unless they themselves face direct backlash over concerns primarily aimed at YouTube. That said, advertisers have been quick over the last several years to pull or pause ad buys on both YouTube and YouTube Kids over child safety concerns.

This article has been updated to include a statement from Google.