Facebook Says the Reach of Fake News Operations During the 2016 Election Was ‘Statistically Very Small’

'The reach of the content shared by false amplifiers was marginal compared to the overall volume of civic content shared during the U.S. election'

Facebook’s efforts to keep its user base informed about the steps it is taking to combat fake news now include a look at the period that brought the issue to the forefront in the first place: the 2016 U.S. presidential election.

Facebook threat intelligence team manager Jen Weedon, threat intelligence analyst William Nuland and chief security officer Alex Stamos published a report, “Information Operations and Facebook,” which included a detailed look at the period leading up to Election Day. They wrote:

During the 2016 U.S. presidential election season, we responded to several situations that we assessed to fit the pattern of information operations. We have no evidence of any Facebook accounts being compromised as part of this activity, but nonetheless, we detected and monitored these efforts in order to protect the authentic connections that define our platform.

One aspect of this included malicious actors leveraging conventional and social media to share information stolen from other sources, such as email accounts, with the intent of harming the reputation of specific political targets. These incidents employed a relatively straightforward yet deliberate series of actions:

  • Private and/or proprietary information was accessed and stolen from systems and services (outside of Facebook).
  • Dedicated sites hosting this data were registered.
  • Fake personas were created on Facebook and elsewhere to point to and amplify awareness of this data.
  • Social media accounts and pages were created to amplify news accounts of and direct people to the stolen data.
  • From there, organic proliferation of the messaging and data through authentic peer groups and networks was inevitable.

Concurrently, a separate set of malicious actors engaged in false amplification using inauthentic Facebook accounts to push narratives and themes that reinforced or expanded on some of the topics exposed from stolen data. Facebook conducted research into overall civic engagement during this time on the platform and determined that the reach of the content shared by false amplifiers was marginal compared to the overall volume of civic content shared during the U.S. election.

In short, while we acknowledge the ongoing challenge of monitoring and guarding against information operations, the reach of known operations during the U.S. election of 2016 was statistically very small compared to overall engagement on political issues.

Facebook is not in a position to make definitive attribution to the actors sponsoring this activity. It is important to emphasize that this example case comprises only a subset of overall activities tracked and addressed by our organization during this time period. However, our data does not contradict the attribution provided by the U.S. director of national intelligence in the report dated Jan. 6.

Weedon, Nuland and Stamos defined information operations as “actions taken by organized actors (governments or non-state actors) to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome,” and they explained how they broke out the broad term “fake news” into more specific terms:

The term “fake news” has emerged as a catch-all phrase to refer to everything from news articles that are factually incorrect to opinion pieces, parodies and sarcasm, hoaxes, rumors, memes, online abuse and factual misstatements by public figures that are reported in otherwise accurate news pieces. The overuse and misuse of the term “fake news” can be problematic because without common definitions, we cannot understand or fully address these issues.

We’ve adopted the following terminology to refer to these concepts:

  • Information (or influence) operations: Actions taken by governments or organized non-state actors to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome. These operations can use a combination of methods, such as false news, disinformation or networks of fake accounts (false amplifiers) aimed at manipulating public opinion.
  • False news: News articles that purport to be factual, but which contain intentional misstatements of fact with the intention to arouse passions, attract viewership or deceive.
  • False amplifiers: Coordinated activity by inauthentic accounts with the intent of manipulating political discussion (e.g., by discouraging specific parties from participating in discussion, or amplifying sensationalistic voices over others).
  • Disinformation: Inaccurate or manipulated information/content that is spread intentionally. This can include false news, or it can involve more subtle methods, such as false flag operations, feeding inaccurate quotes or stories to innocent intermediaries or knowingly amplifying biased or misleading information. Disinformation is distinct from misinformation, which is the inadvertent or unintentional spread of inaccurate information without malicious intent.

The three authors concluded:

Providing a platform for diverse viewpoints while maintaining authentic debate and discussion is a key component of Facebook’s mission. We recognize that in today’s information environment, social media plays a sizable role in facilitating communications—not only in times of civic events, such as elections, but in everyday expression. In some circumstances, however, we recognize that the risk of malicious actors seeking to use Facebook to mislead people or otherwise promote inauthentic communications can be higher. For our part, we are taking a multifaceted approach to help mitigate these risks:

  • Continually studying and monitoring the efforts of those who try to negatively manipulate civic discourse on Facebook.
  • Innovating in the areas of account access and account integrity, including identifying fake accounts and expanding our security and privacy settings and options.
  • Participating in multi-stakeholder efforts to notify and educate at-risk people of the ways they can best keep their information safe.
  • Supporting civil society programs around media literacy.

Just as the information ecosystem in which these dynamics are playing out is a shared resource and a set of common spaces, the challenges we address here transcend the Facebook platform and represent a set of shared responsibilities. We have made concerted efforts to collaborate with peers both inside the technology sector and in other areas, including governments, journalists and news organizations, and together we will develop the work described here to meet new challenges and make additional advances that protect authentic communication online and support strong, informed and civically engaged communities.
