Facebook Is Now Reducing the Reach of Users Who Routinely Share Fake News, Clickbait and Spam

Taking aim at misinformation, sensationalism and bad content

The social network is cracking down on bad actors.

Facebook revealed today that it is cutting back the reach of posts in the news feed from users who have a track record of regularly sharing fake news, clickbait articles or links to spammy pages.

“Our research shows that there is a tiny group of people on Facebook who routinely share vast amounts of public posts per day, effectively spamming people’s feeds,” the company explained in a blog post. “Our research further shows that the links they share tend to include low-quality content such as clickbait, sensationalism, and misinformation. As a result, we want to reduce the influence of these spammers and deprioritize the links they share more frequently than regular sharers. Of course, this is only one signal among many others that may affect the ranking prioritization of this type of post. This update will only apply to links, such as an individual article, not to domains, pages, videos, photos, check-ins or status updates.”
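In ranking terms, the change amounts to one additional down-weighting signal that applies only to individual link posts shared by hyperactive sharers. The sketch below is a minimal illustration of that idea in Python; the class, threshold and weight are hypothetical assumptions made for the example, not Facebook's actual code or parameters.

from dataclasses import dataclass

# Illustrative only: hypothetical names and numbers, not Facebook's real system.
HYPERACTIVE_SHARES_PER_DAY = 50  # assumed cutoff for "vast amounts of public posts per day"
DEPRIORITIZE_WEIGHT = 0.5        # assumed down-weighting factor for links from such sharers

@dataclass
class Post:
    kind: str               # "link", "photo", "video", "status", "check_in", "page"
    base_rank_score: float  # score produced by the other ranking signals
    sharer_daily_shares: int

def adjusted_rank_score(post: Post) -> float:
    """Apply the hyperactive-sharer signal as one factor among many.

    Only individual link posts are affected; domains, pages, videos,
    photos, check-ins and status updates keep their original score.
    """
    if post.kind == "link" and post.sharer_daily_shares >= HYPERACTIVE_SHARES_PER_DAY:
        return post.base_rank_score * DEPRIORITIZE_WEIGHT
    return post.base_rank_score

if __name__ == "__main__":
    spammy_link = Post(kind="link", base_rank_score=1.0, sharer_daily_shares=120)
    normal_link = Post(kind="link", base_rank_score=1.0, sharer_daily_shares=3)
    photo = Post(kind="photo", base_rank_score=1.0, sharer_daily_shares=120)
    print(adjusted_rank_score(spammy_link))  # 0.5 -> deprioritized
    print(adjusted_rank_score(normal_link))  # 1.0 -> unchanged
    print(adjusted_rank_score(photo))        # 1.0 -> non-link posts unaffected

The key point the sketch captures is that the new signal is multiplicative with, not a replacement for, everything else in the ranking mix, and that it is scoped to specific links rather than to the publisher's whole domain or page.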

Publishers that get traffic from such bad actors could be negatively affected. Those publishers, Facebook said, “may see a reduction in the distribution of those specific links.”

The move follows the Menlo Park, Calif.-based company’s effort in April, when it disrupted a major spam operation that had been filling the social network with fake likes and comments. Facebook has also been tweaking its system to fight clickbait for the last several months.
