Why Leaked Moderation Documents Are the Best Thing to Happen to Social Media

Opinion: Facebook can and should lead the charge by supporting its human-led moderation with better technology

These days, anything leaked is usually the beginning of a series of unfortunate and very public events that ends in litigation, extreme embarrassment, firings or all three.

The recent leak of 100 manuals covering Facebook’s moderation policy, however, may turn out to be a genuinely positive development.

It not only demonstrates just how Herculean the task of moderation is, but also underscores that this is the No. 1 problem Facebook needs to innovate around and solve. Who stands to benefit? Quite frankly, everyone who uses social media, consumers and brands alike.

Social media has undoubtedly and unfortunately made it easier for people to spread hate speech, abuse, threats, terrorist content and yes, even revenge porn. With 1.284 billion daily users, Facebook now has a monumental task on its hands.

The Guardian’s reveal of Facebook’s moderation guidelines shows the delicate balance the social network must strike between protecting freedom of speech and providing a safe space for roughly one-quarter of the world’s population to come together every month.

Social media moderation is a difficult job even in the best of times. The Guardian reported that Facebook moderators are concerned about the complexity of the guidelines they follow. Facebook doesn’t ban all sexual content. It doesn’t forbid all objectionable content or every disturbing video. Facebook is regularly criticized for deleting content that shouldn’t be seen as objectionable, such as mothers breastfeeding their babies. It’s even been accused of trying to erase history by censoring the Vietnam War photo of a naked and crying child.

Sure, Facebook did create a thoughtful and detailed set of guidelines to provide moderation advice on a wide range of issues, language and image use. Yet with 54,000 cases of revenge porn alone in a single month, it’s clear that moderators have little time to react to each piece of content, and much more needs to be done.

Facebook’s ‘Mount Everest’ challenge

Moderation isn’t censorship, nor is it simply about policing content posted online. But moderators, and the brands that employ them, must tread an incredibly fine line between defending freedom of expression and protecting the members of an online community.

Facebook’s moderators need to adhere to the company’s internal guidelines, ensure that content obeys the law and provide users with a safe space that still encourages freedom of expression and debate. Yet we all find different things offensive: what one person sees as banter, another may classify as bullying.

Of course, the issue becomes even more complex when you add offerings like Facebook Live into the mix.

Innovative moderation: Facebook’s real opportunity

As the leader in this space, Facebook can and should lead the charge by supporting its human-led moderation with better technology. It must invest in machine learning to assist its moderation team. This could act as the initial stage of moderation that all flagged content goes through before being sent to a moderator for review. While most machine-learning moderation tools currently work only in English, there’s no reason Facebook couldn’t extend them to handle scores of global languages.

No matter how good the moderator, human beings aren’t machines. If the rules are too complex and there are too many variables to weigh in too little time, all while dealing with vast amounts of content, our split-second judgments won’t always be right.

A program developed to support moderators can handle the most obvious infractions of guidelines or laws and reduce the volume of content that humans must review individually; a rough sketch of the idea follows.
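
To make that concrete, here is a minimal sketch in Python of what such a first-pass triage might look like. Everything in it is hypothetical: the threshold, the names and the keyword-based classify() stub are illustrations only, and a production system would replace the stub with a model trained on past, multilingual moderation decisions.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical confidence cut-off above which the machine acts without a human.
AUTO_ACTION_THRESHOLD = 0.95


@dataclass
class FlaggedPost:
    post_id: str
    text: str


def classify(post: FlaggedPost) -> float:
    """Stand-in for a trained, ideally multilingual, policy-violation classifier.

    Returns a score in [0, 1]. This stub just keyword-matches; a real system
    would score content with a model learned from labelled moderation decisions.
    """
    obvious = ("revenge porn", "terrorist recruitment")
    return 0.99 if any(term in post.text.lower() for term in obvious) else 0.40


def triage(flagged: List[FlaggedPost]) -> Tuple[List[FlaggedPost], List[FlaggedPost]]:
    """First pass over flagged content: clear-cut breaches are handled
    automatically, everything else is queued for a human moderator."""
    auto_handled: List[FlaggedPost] = []
    review_queue: List[FlaggedPost] = []
    for post in flagged:
        if classify(post) >= AUTO_ACTION_THRESHOLD:
            auto_handled.append(post)   # obvious infraction: machine handles it
        else:
            review_queue.append(post)   # nuanced case: needs human judgment
    return auto_handled, review_queue


if __name__ == "__main__":
    reports = [
        FlaggedPost("1", "Sharing revenge porn of my ex"),
        FlaggedPost("2", "A heated but lawful political argument"),
    ]
    auto, humans = triage(reports)
    print(f"{len(auto)} auto-actioned, {len(humans)} sent to human moderators")
```

The design choice that matters is the split itself: the machine only removes what it is near-certain about, so the ambiguous, context-heavy cases still land with people, but in a far smaller pile.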

Google and IBM are pioneering machine-learning efforts of their own, but this latest “leak” brings to the forefront a tremendous opportunity for Facebook to devote its engineering resources to innovating in moderation and social media safety.

Whether it embraces the challenge or not, Facebook, as the social network used by roughly a quarter of the world’s population, has a huge responsibility. It needs to protect free speech, yet also keep us safe from abuse and illegal content. Time will tell if it can step up and provide proper moderation at scale.

Tamara Littleton is CEO of social media agency The Social Element.

Image courtesy of RapidEye/iStock.