Facebook’s Monika Bickert Addresses Community Standards, Content Moderation Issues

The biggest challenge for content reviewers is understanding the context of the content they review

In a long, detailed Newsroom post, Facebook head of global policy management Monika Bickert sought to provide some clarity on the social network’s community standards and its moderation of potentially inflammatory or hurtful content.

Bickert explained in her post that the biggest challenge for content reviewers is understanding the context of that content, writing:

It’s hard to judge the intent behind one post or the risk implied in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help?

She also explained that Facebook’s community standards are updated when necessary and are always available for viewing online, adding that the social network intentionally withholds certain details of those standards to prevent people from finding ways to work around them.

Bickert wrote that content reviewers are trained using “extreme” hypothetical situations, adding:

We try hard to stay objective. The cases we review aren’t the easy ones: They are often in a grey area where people disagree. Art and pornography aren’t always easily distinguished, but we’ve found that digitally generated images of nudity are more likely to be pornographic than handmade ones, so our policy reflects that.

There’s a big difference between general expressions of anger and specific calls for a named individual to be harmed, so we allow the former but don’t permit the latter.

These tensions—between raising awareness of violence and promoting it, between freedom of expression and freedom from fear, between bearing witness to something and gawking at it—are complicated, and there are rarely universal legal standards to provide clarity. Being as objective as possible is the only way we can be consistent across the world. But we still sometimes end up making the wrong call.

And she addressed the “damned if you do, damned if you don’t” nature of content moderation, writing:

We face criticism from people who want more censorship and people who want less. We see that as a useful signal that we are not leaning too far in any one direction.

Finally, Bickert discussed Facebook’s plans to allocate more resources to these issues:

All of us know there is more we can do. Last month, we announced that we are hiring an extra 3,000 reviewers. This is demanding work, and we will continue to do more to ensure that we are giving them the right support, both by making it easier to escalate hard decisions quickly and by providing the psychological support they need.
