Facebook’s ‘Remove, Reduce and Inform’ Problematic Content Strategy Has Several New Elements

The social network hosted a group of journalists in Menlo Park last week to share details

Facebook met with a group of journalists last week at its headquarters in Menlo Park, Calif., to share several updates to its “remove, reduce and inform” strategy, which was introduced in 2016 for handling problematic content across its family of applications.

Vice president of integrity Guy Rosen and head of News Feed integrity Tessa Lyons wrote in a Newsroom post last week, “This strategy applies not only during critical times like elections, but year-round.”

Here is an overview of the updates shared by the social network last week.

A new section was added to Facebook’s community standards (in English) last week, enabling people to track when existing policies are revised and when new ones are added. Facebook said it will share specifics on the more substantive changes to those standards.

Steps are being taken to make group administrators more accountable for community standards violations.

The social network said that starting in the coming weeks, when groups are reviewed for potential takedowns, content from admins and moderators will be examined, including member posts they have approved.

Facebook is also adding a Group Quality feature globally in the coming weeks to provide people with an overview of content in groups that was removed or flagged, as well as instances of fake news shared within groups.

Rosen and Lyons wrote, “For the past two years, for example, we’ve been working on something called the Safe Communities Initiative, with the mission of protecting people from harmful groups and harm in groups. By using a combination of the latest technology, human review and user reports, we identify and remove harmful groups, whether they are public, closed or secret. We can now proactively detect many types of violating content posted in groups before anyone reports them and sometimes before few people, if any, even see them.”
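Facebook has not published how those signals are combined, but a minimal sketch can illustrate the shape of such a pipeline. Everything below, the field names, weights and scoring, is an assumption for illustration rather than Facebook’s implementation; it simply shows proactive classifier scores, user reports and the admin-accountability change feeding a single review score.

```python
from dataclasses import dataclass

@dataclass
class GroupPost:
    author_is_admin: bool    # posted by an admin or a moderator
    admin_approved: bool     # member post approved by an admin or a moderator
    classifier_score: float  # 0..1 likelihood of a standards violation
    user_reports: int        # user reports filed against the post

def group_violation_score(posts: list[GroupPost]) -> float:
    """Aggregate per-post signals into a single score for human review."""
    score = 0.0
    for post in posts:
        # Content admins posted or explicitly approved is weighted more
        # heavily, mirroring the accountability change described above.
        weight = 2.0 if (post.author_is_admin or post.admin_approved) else 1.0
        score += weight * (post.classifier_score + 0.1 * post.user_reports)
    return score / max(len(posts), 1)
```

In a pipeline like this, groups scoring above some review threshold would be queued for human review, whether public, closed or secret.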

They also discussed the challenges presented by Stories: “The format’s ephemerality means that we need to work even faster to remove violating content. The creative tools that give people the ability to add text, stickers and drawings to photos and videos can be abused to mask violating content. And because people enjoy stringing together multiple Story cards, we have to view Stories as holistic—if we evaluate individual Story cards in a vacuum, we might miss standards violations.”
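To make that holistic point concrete, here is a toy sketch, not Facebook’s system, in which a banned phrase split across two Story cards is only caught when the cards are evaluated as a sequence; the keyword matcher stands in for a real classifier.

```python
BANNED_PHRASES = {"buy fake ids"}  # illustrative stand-in for a real model

def violates(text: str) -> bool:
    """Toy classifier: flag text containing a banned phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in BANNED_PHRASES)

def review_story(cards: list[str]) -> bool:
    # Per-card checks can miss content split across cards...
    if any(violates(card) for card in cards):
        return True
    # ...so the full card sequence is also evaluated as one unit.
    return violates(" ".join(cards))

print(review_story(["buy fake", "ids here"]))  # True, but only holistically
```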

Facebook will explore more collaborations with outside experts, with Rosen and Lyons pointing out, “Our professional fact-checking partners are an important piece of our strategy against misinformation, but they face challenges of scale: There simply aren’t enough professional fact-checkers worldwide and, like all good journalism, fact-checking takes time.”

The social network said it plans to consult “a wide range of academics, fact-checking experts, journalists, survey researchers and civil society organizations” to help understand the benefits and risks of ideas such as groups of Facebook users turning to journalistic sources to corroborate or contradict claims made in potentially false content.

Rosen and Lyons wrote, “We need to find solutions that support original reporting, promote trusted information, complement our existing fact-checking programs and allow for people to express themselves freely—without having Facebook be the judge of what is true. Any system we implement must have safeguards from gaming or manipulation, avoid introducing personal biases and protect minority voices.”

Facebook operations specialist Henry Silverman went into far greater detail in a separate Newsroom post, writing, “We’re going to share with experts the details of the methodology we’ve been thinking about to help these experts get a sense of where the challenges and opportunities are and how they’ll help us arrive at a new approach. We’ll also share updates from these conversations throughout the process and find ways to solicit broader feedback from people around the world who may not be in the core group of experts attending these roundtable events.”

On the subject of fact-checking, The Associated Press saw its role expand last week, as it began debunking false and misleading videos and evaluating Spanish-language content on Facebook in the U.S.

Facebook said that groups that repeatedly share misinformation rated false by independent fact-checkers began seeing their News Feed distribution reduced last week.
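Facebook did not disclose the formula behind that reduction, but as a hedged sketch, one can imagine a multiplier that shrinks a group’s distribution with each rating of false from an independent fact-checker; the decay rate and floor below are invented for illustration.

```python
def demotion_multiplier(false_ratings: int, decay: float = 0.5,
                        floor: float = 0.05) -> float:
    """Scale News Feed distribution down as fact-checkers rate more of a
    group's shared content false. Decay and floor are assumptions."""
    return max(decay ** false_ratings, floor)

base_score = 1.0
for strikes in range(4):
    print(strikes, base_score * demotion_multiplier(strikes))
# Zero strikes leaves distribution intact; each "false" rating halves it,
# down to a 5% floor in this sketch.
```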

Also on News Feed, a Click-Gap signal added to the ranking process last week weeds out domains with a disproportionate number of outbound Facebook clicks compared with their place in the web graph, described as “a conceptual ‘map’ of the internet in which domains with a lot of inbound and outbound links are at the center of the graph and domains with fewer inbound and outbound links are at the edges.”

Facebook said this type of disparity often signals low-quality content.
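The exact formula is not public, but the description above suggests a ratio along these lines; the function, parameters and sample figures here are assumptions, intended only to show what “disproportionate” could mean in practice.

```python
def click_gap(fb_clicks: int, total_fb_clicks: int,
              inbound_links: int, total_links: int) -> float:
    """Compare a domain's share of clicks out of Facebook with its share
    of inbound links in the web graph (all names are illustrative)."""
    click_share = fb_clicks / total_fb_clicks
    graph_share = max(inbound_links / total_links, 1e-9)  # avoid zero division
    return click_share / graph_share

# A domain drawing 5% of Facebook's outbound clicks while holding a tiny
# slice of the web graph scores very high and would be a demotion candidate.
print(click_gap(fb_clicks=50_000, total_fb_clicks=1_000_000,
                inbound_links=10, total_links=10_000_000))  # 50000.0
```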

The context button Facebook introduced in the U.S. last April to provide users with additional context on publishers’ articles will soon be expanded to images that were reviewed by third-party fact-checkers. The social network said this is currently being tested in the U.S.

In November 2017, Facebook, Twitter, Google and Bing teamed up with The Trust Project, a nonpartisan global network of news organizations, on Trust Indicators, which are aimed at helping validate legitimate news publishers and weed out fake news.

Facebook began adding Trust Indicators to its context button in March, for English- and Spanish-language content.

The social network’s Page Quality tab, which debuted in January, will soon contain more information, such as pages’ status with respect to clickbait.

When people leave Facebook groups, they will soon be able to remove their posts and comments from those groups.

As for Messenger, Facebook’s flagship messaging app, the social network’s verified badge is being extended to the app globally to help people avoid impersonators.

Messenger also now boasts messaging settings, enabling people to control whether people they are not connected to, such as friends of friends, people who have their phone numbers or their Instagram followers, can reach their chats lists.
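Facebook has not detailed the underlying model, but settings like these map naturally onto a routing rule; the sender categories and return values below are assumptions sketched from the description, not Messenger’s actual API.

```python
from enum import Enum, auto

class Sender(Enum):
    FRIEND = auto()
    FRIEND_OF_FRIEND = auto()
    HAS_PHONE_NUMBER = auto()
    INSTAGRAM_FOLLOWER = auto()
    OTHER = auto()

def route_message(sender: Sender, allowed: set[Sender]) -> str:
    """Deliver connected senders to chats; gate everyone else on settings."""
    if sender is Sender.FRIEND or sender in allowed:
        return "chats"
    return "message_requests"  # held until the recipient accepts

settings = {Sender.FRIEND_OF_FRIEND, Sender.HAS_PHONE_NUMBER}
print(route_message(Sender.INSTAGRAM_FOLLOWER, settings))  # message_requests
```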

And the messaging app’s block feature has been updated to help people avoid unwanted contact.

Earlier this year, Facebook added the aforementioned context button to Messenger, along with a forward indicator, which enables people worldwide to determine whether a message they received was forwarded by the sender.