Facebook to Ban Ads That Delegitimize the Election or Promote QAnon Conspiracy Theories

Policies continue to change in the lead-up to November

Ahead of the November elections, Facebook is continuing to tweak its advertising policies. 

The social media company announced Wednesday evening that it will ban ads that executives believe “delegitimize the outcome of an election” and will formally ban ads that promote QAnon conspiracy theories.

Ads on Facebook and Instagram that call voting methods “inherently fraudulent or corrupt or [are] using isolated incidents of voter fraud to delegitimize the result of an election” will also not be allowed, said Facebook director of product management Rob Leathern.

This is the latest change Facebook has made to its political ads policies. It recently allowed users to opt out of seeing political ads and announced it will reject ads that prematurely claim victory after the election. It had also announced it would stop running new political ads a week before Election Day, though critics said this could hurt get-out-the-vote efforts.

The new rule does not outlaw organic posts that also seek to undermine the election, though the company indicated earlier this month that it will add an “information label” on these types of posts. Any posts suggesting “people will get Covid-19 if they take part in voting” will be banned outright.

While some competitors like Twitter have banned political ads entirely, Facebook does not fact-check political ads or limit microtargeting. 

With this latest announcement, Facebook also took steps to curb the spread of QAnon posts by no longer accepting ads related to the conspiracy theories. QAnon has previously been linked to offline crimes including murder and kidnapping.

Even before this announcement, Facebook had taken steps to limit the spread of QAnon content on its platform.

In May, Facebook took down eight disinformation networks—20 Facebook accounts, six groups and five pages—under its “coordinated inauthentic behavior” policy. In August, the company went even further, removing 790 groups, 100 pages and 1,500 ads, and blocking 300 hashtags across Facebook and Instagram related to QAnon. It also restricted an additional 1,950 groups and 440 pages on Facebook and more than 10,000 accounts on Instagram.

In an update to this policy, Facebook said this week that it would ban ads that “praise, support or represent militarized social movements and QAnon.”

“We are taking steps to address evidence that QAnon adherents are increasingly using the issue of child safety and hashtags like #savethechildren to recruit and organize,” Facebook said in a statement.

Now, according to Facebook’s policy, when users search for or use such hashtags, the platform will direct them to “credible child safety resources.” Additionally, Facebook will now apply its third-party fact-checking procedures to QAnon and child safety content on the platform and will label and limit the spread of any previously debunked stories.


Scott Nover (@ScottNover, scott.nover@adweek.com) is a platforms reporter at Adweek, covering social media companies and their influence.