Facebook Tries to Add Clarity to Its Community Standards

By David Cohen

How does Facebook draw the line between what users are allowed to post and what types of content should be banned? The social network attempted to clarify its policies with Monday’s release of updated community standards.

The revised community standards are divided into four sections:

  • Helping to keep you safe.
  • Encouraging respectful behavior.
  • Keeping your account and personal information secure.
  • Protecting your intellectual property.

For example, Facebook’s policy on images that contain nudity has been called into question in the past, and the social network wrote on the subject:

People sometimes share content containing nudity for reasons like awareness campaigns or artistic projects. We restrict the display of nudity because some audiences within our global community may be sensitive to this type of content — particularly because of their cultural background or age. In order to treat people fairly and respond to reports quickly, it is essential that we have policies in place that our global teams can apply uniformly and easily when reviewing content. As a result, our policies can sometimes be more blunt than we would like and restrict content shared for legitimate purposes. We are always working to get better at evaluating this content and enforcing our standards.

We remove photographs of people displaying genitals or focusing in on fully exposed buttocks. We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breast-feeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures and other art that depicts nude figures. Restrictions on the display of both nudity and sexual activity also apply to digitally created content unless the content is posted for educational, humorous or satirical purposes. Explicit images of sexual intercourse are prohibited. Descriptions of sexual acts that go into vivid detail may also be removed.

And on hate speech, Facebook clarified:

Facebook removes hate speech, which includes content that directly attacks people based on their:

  • Race
  • Ethnicity
  • National origin
  • Religious affiliation
  • Sexual orientation
  • Sex, gender or gender identity
  • Serious disabilities or diseases

Organizations and people dedicated to promoting hatred against these protected groups are not allowed a presence on Facebook. As with all of our standards, we rely on our community to report this content to us.

People can use Facebook to challenge ideas, institutions and practices. Such discussion can promote debate and greater understanding. Sometimes people share content containing someone else’s hate speech for the purpose of raising awareness or educating others about that hate speech. When this is the case, we expect people to clearly indicate their purpose, which helps us better understand why they shared that content.

We allow humor, satire or social commentary related to these topics, and we believe that when people use their authentic identity, they are more responsible when they share this kind of commentary. For that reason, we ask that page owners associate their names and Facebook profiles with any content that is insensitive, even if that content does not violate our policies. As always, we urge people to be conscious of their audience when sharing this type of content.

While we work hard to remove hate speech, we also give you tools to avoid distasteful or offensive content. Learn more (https://www.facebook.com/help/359033794168099/) about the tools we offer to control what you see. You can also use Facebook to speak up and educate the community around you. Counter-speech in the form of accurate information and alternative viewpoints can help create a safer and more respectful environment.

Facebook head of global policy management Monika Bickert and deputy general counsel Chris Sonderby offered more details in a Newsroom post:

Today we are providing more detail and clarity on what is and is not allowed. For example, what exactly do we mean by nudity, or what do we mean by hate speech? While our policies and standards themselves are not changing, we have heard from people that it would be helpful to provide more clarity and examples, so we are doing so with today’s update.

There are also times when we may have to remove or restrict access to content because it violates a law in a particular country, even though it doesn’t violate our community standards. We report the number of government requests to restrict content for contravening local law in our global Government Requests Report, which we are also releasing today. We challenge requests that appear to be unreasonable or overbroad. And if a country requests that we remove content because it is illegal in that country, we will not necessarily remove it from Facebook entirely, but may restrict access to it in the country where it is illegal.

Billions of pieces of content are shared on Facebook every day. We hope these two updates help provide more clarity about the standards we have, whether they are our own community standards or those imposed by different laws around the world.

In particular, we’ve provided more guidance on policies related to self-injury, dangerous organizations, bullying and harassment, criminal activity, sexual violence and exploitation, nudity, hate speech and violence and graphic content. While some of this guidance is new, it is consistent with how we’ve applied our standards in the past.

It’s a challenge to maintain one set of standards that meets the needs of a diverse global community. For one thing, people from different backgrounds may have different ideas about what’s appropriate to share — a video posted as a joke by one person might be upsetting to someone else, but it may not violate our standards.

This is particularly challenging for issues such as hate speech. Hate speech has always been banned on Facebook, and in our new community standards, we explain our efforts to keep our community free from this kind of abusive language. We understand that many countries have concerns about hate speech in their communities, so we regularly talk to governments, community members, academics and other experts from around the globe to ensure that we are in the best position possible to recognize and remove such speech from our community. We know that our policies won’t perfectly address every piece of content, especially where we have limited context, but we evaluate reported content seriously and do our best to get it right.

If people believe pages, profiles or individual pieces of content violate our community standards, they can report it to us by clicking the “Report” link at the top right-hand corner. Our reviewers look to the person reporting the content for information about why they think the content violates our standards. People can also unfollow, block or hide content and people they don’t want to see, or reach out to people who post things that they don’t like or disagree with.

While the community standards outline Facebook’s expectations when it comes to what content is or is not acceptable in our community, countries have local laws that prohibit some forms of content. In some countries, for example, it is against the law to share content regarded as being blasphemous. While blasphemy is not a violation of the community standards, we will still evaluate the reported content and restrict it in that country if we conclude it violates local law.

And Facebook chief operating officer Sheryl Sandberg chimed in with her own post:

Every day, almost 1 billion people come to Facebook to share the things that matter to them — from milestones like births and weddings to everyday moments with friends. On Facebook, people feel safe being themselves. That’s what makes our community possible, and it’s something we’re always thinking about. Keeping you safe on Facebook is our top priority.

Today we’re publishing more details about our community standards to help people understand what is and isn’t OK to share. We created the standards to keep people safe and encourage respect. We’re also releasing our global Government Requests Report, which lists government requests to have content that’s illegal in their countries removed or restricted. [Facebook co-founder and CEO Mark Zuckerberg] shared some of his thoughts today about how we handle these situations and our responsibility to the people who use Facebook around the world.

We’re going to keep working hard to make sure people feel safe on Facebook. Thank you for being a part of the Facebook community.

Readers: What did you think of the new information in Facebook’s community standards?
