Facebook Is Opting for Context Over Labeling in Its Battle Against Fake News

Disputed Flags are out and Related Articles are in

Facebook continued down the path of directing, not deleting. (Image: IvelinRadkov/iStock)

Facebook is trying a new tactic in its balancing act between stopping the spread of misinformation and avoiding the appearance of Big Brother-style censorship of its users’ content.

The social network announced last December that when posts were reported by users as questionable and identified as fake by third-party fact-checking organizations, those posts would be flagged as disputed. They could still be shared, but they would carry warnings and would not be eligible to appear in promoted posts or ads.

Vice president of News Feed Adam Mosseri wrote at the time, “We believe providing more context can help people decide for themselves what to trust and what to share. We’ve started a program to work with third-party fact-checking organizations that are signatories of Poynter’s International Fact Checking Code of Principles. We’ll use the reports from our community, along with other signals, to send stories to these organizations. If the fact-checking organizations identify a story as fake, it will get flagged as disputed, and there will be a link to the corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed.”

According to Facebook, those demotions in News Feed have worked, with the social network saying that articles suffering that fate “typically lose 80 percent of their traffic,” removing the economic incentives that lead spammers and troll farms to post the content in the first place.

Facebook took further aim at those economic incentives in August, announcing that pages that repeatedly share fake news stories would be banned from advertising altogether until those practices were curbed.

Now, Disputed Flags are out and Related Articles are in.

The social network began testing an “i” button next to select articles in News Feed in October that provides users with additional context on those articles, including Related Articles.

Director of product management Andrew Anker, News Feed product manager Sara Su and product designer Jeff Smith said in a Newsroom post at the time: “This new feature is designed to provide people some of the tools they need to make an informed decision about which stories to read, share and trust. It reflects feedback from our community, including many publishers who collaborated on its development as part of our work through the Facebook Journalism Project.”

Why the reversal? Product manager Tessa Lyons said in a Newsroom post this week that “academic research” has found that strong images, such as red flags next to articles, may actually counteract what they were intended to do by entrenching “deeply held beliefs.”

Conversely, Related Articles are designed to give readers more context, and Lyons said that Facebook has found that showing Related Articles next to questionable posts leads to less sharing than when the Disputed Flag is included next to posts.

Lyons also said Facebook is launching a new initiative to help it better understand how its users decide whether information is accurate based on the news sources they depend on most, adding that News Feed will not be directly impacted in the near term.

David Cohen (david.cohen@adweek.com) is editor of Adweek's Social Pro Daily.