5 Things You Need to Know About Facebook’s Updated Community Guidelines

No nipples, hate speech, bullying or harassment


Facebook has released a new, detailed outline of its Community Standards, explaining what the tech giant means when it says things like "no nudity" or "no hate speech." Facebook has enforced these standards in a variety of ways over the years, but the social network says it mostly relies on its more than 1 billion users to report violations. 

"It's a challenge to maintain one set of standards that meets the needs of a diverse global community," said a rep for Facebook in a post. "People from different backgrounds may have different ideas about what's appropriate to share—a video posted as a joke by one person might be upsetting to someone else, but it may not violate our standards." 

Given that Facebook is a global network, its standards are enforced differently in different parts of the world. Countries alert Facebook when content may break local laws, and the tech giant investigates and restricts those posts when warranted. Facebook then adds those instances to its Global Government Requests Report; during the second half of 2014, there were 35,051 data requests. 

Here are five important guidelines marketers need to know: 

  • Nudity: Facebook will remove posts that show genitals, fully exposed buttocks and some photos of female breasts if the nipple is included. However, contrary to some of Facebook's past behavior, photos of women engaged in breastfeeding or photos showing breasts with post-mastectomy scarring are allowed. Various art forms that depict the female figure are also allowed. 
  • Hate speech: Posts that attack someone's race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, or serious disability or disease will be removed. Facebook does acknowledge that posts containing hate speech are sometimes shared to raise awareness, and in that case, it encourages users to make the intent clear. 
  • Bullying and harassment: Content that targets private individuals—which Facebook defines as "people who have neither gained news attention nor the interest of the public, by way of their actions or public profession"—with the intention of degrading or shaming them will be removed.  
  • Violence and graphic content: Images that are shared for sadistic pleasure or images that celebrate or glorify violence will be removed. 
  • Self-harm: Facebook bans content that champions suicide or any other type of self-injury, such as self-mutilation or eating disorders. 

See the full outline here.