The Number of Fake Accounts Removed From Facebook This Year Nearly Equals Its Total Number of Active Users

Company disabled 2.2 billion fake accounts in the first quarter of 2019

Facebook CEO Mark Zuckerberg told reporters Thursday that bad actors were increasingly trying to create batches of fake accounts on the site.

Facebook disabled more than 2 billion fake accounts in the first quarter of 2019 alone, nearly as many as its total monthly active users, the company said Thursday in a report on how it is enforcing its platform rules.

The company pulled almost 2.2 billion fake accounts in the first quarter of 2019, compared with the nearly 2.4 billion monthly active users it reported for the same period. In the final quarter of 2018, Facebook pulled 1.2 billion accounts from the site for appearing to be fake.

All in all, Facebook has removed more than 3 billion fake accounts in the last six months. Roughly 5% of Facebook’s monthly active users are fake, the company estimates.

The staggering number from Facebook is an indication of the sheer scale of the challenges the company faces as it tries to tackle bad behavior in its various forms. The numbers were released Thursday as part of Facebook’s Community Standards Enforcement report, which details information about the kinds of content Facebook has taken action on.

Facebook downplayed the number of fake accounts, saying that “most” of those accounts were caught before they were ever counted as active Facebook users. And in a blog post Thursday morning, Facebook said looking at the raw number of fake accounts on the platform “may be a bad way to look at things,” though reporting those figures has become an industry standard.

During a call with reporters Thursday, Facebook CEO Mark Zuckerberg said the company was taking down more fake accounts than ever because of a rise in automated attempts to create large batches of fake accounts at once.

“Most of these accounts were blocked within minutes of their creation, before they could do any harm, and they were never considered active within our systems,” Zuckerberg said.

The enforcement report, which Facebook said it will soon release quarterly and expand to include data from the Facebook-owned platform Instagram, detailed how the company is faring at enforcing its community standards.

Facebook vice president of integrity Guy Rosen summarized the report’s findings on the call, telling reporters that out of every 10,000 content views on Facebook, 11 to 14 contained adult nudity or sexual activity that violated Facebook’s policies, and 25 contained graphic or violent content that violated its rules.

The report broke down the content Facebook took action on across nine categories, as well as how much of it the company identified automatically rather than through user reports. Facebook said it is using automation to detect more violating content than before: the company detected around 65% of posts violating its rules against hate speech without requiring a user report, up from 24% more than a year earlier. In total, the company has pulled 4 million posts for violating those rules.

Facebook is also now reporting the amount of content it has identified and pulled for violating its regulated goods rules, which prohibit the sale of illicit drugs and firearms. Facebook says it is automatically identifying more of those posts than before.

The data also showed that Facebook restored small portions of the content it took action on, both after appeals and through regular moderator review. Facebook took action on more than 1.7 billion pieces of content it determined violated its rules against spam, but those takedowns drew nearly 21 million appeals, and more than 40 million posts were restored with and without appeals.


“The system will never be perfect, and there will always be people who will disagree with us—they’ll think we’ve gone too far, or think we’ve not gone far enough,” said Monika Bickert, Facebook’s vice president of global policy management. “But we will continue to develop thoughtful policies that balance voice and safety.”

Facebook said that its move toward more encrypted content could impact its ability to track and ultimately tackle certain kinds of violating content, such as hate speech or sexual content.

“There is a trade-off between protecting privacy and protecting safety, and that’s something that every society grapples with,” Rosen said. “We do believe encryption is an incredibly powerful tool for privacy, and we are working to detect bad actors through things like identifying patterns of bad activity. We’re building better tools for people to report bad content to us.”

Rosen nodded to how Facebook has attempted to address the viral spread of misinformation on the encrypted messaging app WhatsApp, which Facebook acquired in 2014. Later during the call, Zuckerberg said that the move to encryption would not, however, affect the company’s ad business, which raked in nearly $15 billion in advertising revenue in the first quarter of 2019 alone.

“As we move toward encryption, it’s actually not going to impact our practices on the advertising side,” Zuckerberg said.

After facing a question about calls to break up Facebook from critics such as Sen. Elizabeth Warren and Facebook co-founder Chris Hughes, Zuckerberg said that Facebook isn’t big enough to be considered a monopoly.

“I think it almost goes without saying that we exist in a very competitive and dynamic environment where new services are constantly coming up,” Zuckerberg said. “I think arguments that we’re in some sort of dominant position there might be a little stretched.”

At the same time, Zuckerberg argued that Facebook’s vastness is what makes it capable of investing in content moderation, making a breakup an ineffective remedy for concerns about misinformation. (Facebook earlier announced that it was going to raise the wages of content moderators in metro areas around the country.)

“If the problems you are most worried about are ones about making sure that we address harmful content, making sure that we address election interference, making sure that we have the right privacy tools and at the same time that people have the ability to bring their data to other services for innovation and competition and research … I don’t think that the remedy of breaking up the company is going to address them,” Zuckerberg said. “I think it’s just going to make it a lot harder. … We’re able to do things that I think are just not possible for other folks to do.”
