Facebook Cracks Down on QAnon Content

The conspiracy group is poised to send a believer to Congress later this year

[Image: Facebook took decisive action on QAnon Wednesday. Getty Images, Facebook]

Facebook took decisive action today, booting accounts promoting QAnon, an unfounded conspiracy theory whose adherents believe a secretive cabal of government employees is trying to undermine President Donald Trump.

In all, Facebook banned 790 groups, 100 pages, 1,500 ads and blocked 300 hashtags across Facebook and Instagram, the company said in a statement. It also imposed restrictions on an additional 1,950 groups and 440 pages on Facebook and more than 10,000 accounts on Instagram.

The Facebook purge was part of a larger effort that also affected “offline anarchist groups that support violent acts amidst protests [and] U.S.-based militia organizations.”

The move was announced as the social media company said it would expand its policy on so-called dangerous individuals and organizations to include groups that pose “significant risks to public safety” but don’t meet the threshold of a dangerous organization, which is banned “from having any presence.”

Facebook will still allow users to post content supportive of such movements and groups, as long as it doesn’t violate site rules, but will “restrict their ability to organize.”

In May, Facebook removed 20 accounts, six groups and five pages linked to QAnon as part of its ban on “coordinated inauthentic behavior,” a policy outlawing disinformation networks. Earlier this month, Facebook banned Official Q/Qanon, one of its largest QAnon groups with 200,000 members, for bullying, harassment and hate speech.

But today’s decision more closely resembled one that Twitter took last month when it banned 7,000 accounts, prohibited QAnon content in Trends and blocked URLs associated with the movement. Twitter said the group routinely contributed to real offline threats and harassment. TikTok recently removed QAnon hashtags and Reddit booted QAnon subreddits in 2018.

Facebook also cracked down on “militia organizations and those encouraging riots, including some who may identify as Antifa,” a loose collective of anti-fascist activists heavily criticized by Trump. The social media company said it removed more than 980 groups, 520 pages and 160 Facebook ads, and restricted more than 1,400 hashtags on Instagram.

Facebook said it would limit and restrict page, group and account recommendations for the affected groups, reduce their reach in the news feed and search, and bar them from commerce and monetization features, which include advertising, shopping and fundraising.

Additionally, on Instagram, the company said it has temporarily removed the “related hashtags” feature, which recommends hashtags similar to the ones people are using, and is “working on stronger protections for people using this feature.”

Facebook has slowly stepped up content moderation this summer under pressure from civil rights groups, legislators and a massive advertiser boycott, as well as from competitors like Twitter and Reddit renewing efforts to root out hate speech and misinformation.

Facebook has also taken action against the boogaloo movement, removed Trump campaign ads featuring Nazi iconography and released a long-delayed civil rights audit.

QAnon has gained steam in recent months, moving from the darker corners of the internet—namely message boards like 8kun—to an unavoidable reality of modern politics. Far from the days of Pizzagate—an early warning of the offline harm this group could do—the president now routinely retweets QAnon accounts. An unabashed supporter, Marjorie Taylor Greene, is the Republican nominee for Congress in Georgia’s deep-red 14th District.


Scott Nover is a platforms reporter at Adweek, covering social media companies and their influence. @ScottNover | scott.nover@adweek.com