TikTok has rolled out a new notification system for cases where creators' content violates its policies.
The video creation platform said it has been experimenting with the new system for the past few months with the aim of giving creators more clarity on why their content was removed.
TikTok said in a blog post that explaining its enforcement actions and reminding users of its policies has slashed the rate of repeat offenses, and visits to its Community Guidelines page have nearly tripled.
The platform has also seen a 14% decrease in requests from users to appeal video removals.
Under the new system, which rolled out globally this week, when a video is removed, the creator will see exactly which policy was violated and be able to appeal the decision.
And in cases where content is flagged as related to self-harm or suicide, a second notification will direct the creator to expert resources.
TikTok wrote, “Being transparent with our community is key to continuing to earn and maintain trust. We’re glad to be able to bring this new notification system to all our users, and we’ll keep working to improve the ways we help our community understand our policies as we continue to build a safe and supportive platform.”