Facebook Published Its Internal Content Review Guidelines and Added an Appeals Process

The social network looked to address two longtime user gripes

Facebook shared an example of the type of content that might be removed in error. (Image: Facebook)

Facebook’s efforts to restore goodwill with its user base following weeks of controversy touched on two longtime sore spots, with a pair of updates announced Tuesday.

The social network has long been criticized for not being transparent about how it decides when content violates its community standards, and vice president of global policy management Monika Bickert announced in a Newsroom post Tuesday that the internal guidelines it uses are now publicly available.

Facebook broke out those guidelines into six topics: violence and criminal behavior; safety; objectionable content; integrity and authenticity; respecting intellectual property; and content-related requests.

The social network has also faced backlash for removing content without providing any recourse for the users who posted it, and Bickert said Facebook will now give those users a way to appeal its decisions on individual posts.

She provided the example pictured above of the type of post that may be mistakenly removed, along with a description of how the new appeals process will work:

  • If your photo, video or post has been removed because we found that it violates our community standards, you will be notified and given the option to request additional review.
  • This will lead to a review by our team (always by a person), typically within 24 hours.
  • If we’ve made a mistake, we will notify you, and your post, photo or video will be restored.

The appeals process will start off with posts that were removed for nudity or sexual activity, hate speech or graphic violence, and Bickert said more violation types will be added.

She also shared more information on Facebook’s content policy team, saying that it has people in 11 offices worldwide, including experts on topics such as hate speech, child safety and terrorism.

As for the social network’s decision to publish its internal guidelines, she wrote, “First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines—and the decisions we make—over time.”

Bickert also noted that the guidelines will be fluid, writing, “In some cases, changes are prompted by input we receive from external stakeholders; in others, we make changes to account for the way language is used; in still others, a change is necessitated by a gap in existing policy. This process will continue—and with it, future updates to our standards. We will be sharing these updates publicly and will be releasing a searchable archive so that people can track changes over time.”

She also addressed the updates to Facebook’s appeals process, saying that its community operations team includes more than 7,500 content reviewers, up more than 40 percent year over year, and works 24 hours per day, seven days per week, in more than 40 languages.

Finally, Bickert discussed instances when content is removed in error, writing, “In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers; when that’s the case, we work to fill those gaps. More often than not, however, we make mistakes because our processes involve people, and people are fallible.”