Just weeks after being hit with a record $5.7 million fine from the Federal Trade Commission for collecting personal information from children under the age of 13 without parental consent, TikTok finds itself in hot water yet again, this time for more serious issues related to kids across the pond.
Marco Silva of BBC Trending reported today that the BBC collected hundreds of sexual comments that were posted on TikTok videos uploaded by teenagers and children over a three-month period, and it reported those comments to TikTok using the app’s tools.
It found that while the majority of those comments were removed within 24 hours of being reported, some were not, and the “vast majority” of accounts responsible for those comments were still active on TikTok.
A TikTok spokesperson said, “We are pleased that the BBC investigation recognized that we have managed to remove the majority of the reported comments. These findings reflect the progress of many of our more recent changes to fight against misuse. The work is never done on our end.”
BBC Trending also said that during the course of its investigation, it identified several users who repeatedly approached teenage girls and posted sexually explicit messages on their videos, as well as instances where threatening or violent messages were sent to children via the app.
The TikTok spokesperson said the company uses a combination of policies, technology and moderation to detect and review problematic content and accounts and penalizes them when appropriate, with actions ranging from restricting certain features to banning account access altogether, depending on severity and frequency.
TikTok would not disclose the number of moderators it employs, but its spokesperson added, “We have a dedicated and growing team of human moderators to manually cross-review tens of thousands of videos and accounts, and we constantly roll out internal training and processes to improve moderation accuracy and efficiency. While these protections won’t catch all instances of misuse, we’re committed to improving and enhancing our protective measures, and we use learnings like these to continually hone our moderation efforts.”
TikTok also detailed changes to its moderation practices meant to improve accuracy and give moderators more context, such as displaying videos alongside the comments posted on them, as well as the addition of text classifiers to better train its artificial intelligence technology to recognize predatory and vulgar behavior.
And the company pointed to safety features that have been added since the app’s debut, including the ability to restrict who can comment on posts and to filter keywords from comments. It also now lets parents place usage time limits on the app for their kids and set their accounts to restricted viewing mode, which prevents them from seeing age-inappropriate content.
In response to the FTC fine, an update to TikTok in late February required users in the U.S. to verify their birthdays and directed those under 13 to a separate, more restricted experience that does not permit them to upload videos and protects their personal information. They can like content and follow users, but can only create videos and save them to their own phones. In the U.K., if an age under 13 is submitted, TikTok automatically blocks the user.
According to the BBC, TikTok has no plans to extend that age-verification feature to the U.K., which did not sit well with Damian Collins, member of Parliament and chairman of the Commons Digital, Culture, Media and Sport Committee.
He told the BBC, “We need to have robust age-verification tools in place. The age policies are meaningless if they don’t have the ability to really check whether people are the age or not. We’ve been discussing content regulation with a number of different social media companies and will certainly be taking a good look at what’s been happening at TikTok.”
The spokesperson for TikTok replied, “Age-based access is a topic that is important to many platforms, including TikTok. Together with our industry peers, we participate in the conversation with experts and third-party organizations to explore future solutions to address this challenge.”
Earlier this week, TikTok joined nonprofit organization Internet Matters “to empower parents and caregivers to keep young people safe in the digital world.”