Facebook Outlines Its Early Efforts to Protect the 2020 US Census

The social network released the second update on its civil rights audit

Facebook will have a full-time team dedicated to protecting the census.

Facebook said it plans to treat the 2020 census in the U.S. like an election—presumably more like the 2018 midterm elections, which it took several steps to protect, and less like the 2016 presidential election, which was marred by controversies.

The social network’s plans for 2020 were part of the second update (embedded below) to the civil rights audit that civil rights and civil liberties advocate Laura Murphy has been conducting for more than a year, with support from civil rights law firm Relman, Dane and Colfax. The first update was issued last December.

Chief operating officer Sheryl Sandberg said in a Newsroom post, “With both the U.S. Census and the U.S. presidential elections, 2020 will be a big year. An accurate census count is crucial to governments for functions like distributing federal funds and to businesses and researchers. That’s why we’re going to treat next year’s census like an election—with people, policies and technology in place to protect against census interference.”

The social network already has a full-time team in place to protect against election interference, drawing on its product, engineering, data science, policy, legal and operations groups. Sandberg said a similar team dedicated to the census will be formed, and policies on census-related misinformation will be crafted.

Facebook will also team up with nonpartisan groups to promote participation in the 2020 census.

Murphy pointed out in her update that the census is instrumental in allocating federal benefits and electoral representation, and minority groups have historically been undercounted.

She said Facebook agreed to extend its existing voter-suppression policies to the census in order to prevent misrepresentation of requirements, methods or logistics for participation.

In early 2020, an expert consultant will provide training to Facebook employees who are responsible for protecting the census.

The social network also said it will use “proactive detection technology,” including trained algorithms, to root out census interference.

Turning to the 2020 election, Murphy said Facebook is working on expanding its policies to cover forms of voting-related interference that are not currently addressed, working with voting-rights experts both in the U.S. and internationally.

Facebook also said that before the 2019 gubernatorial elections in the U.S., it will have a policy in place banning ads targeting U.S. audiences that encourage people not to vote. The social network is also working with its audit team to better identify and remove ads that appeal to people’s racial identities in an attempt to sway their votes.

Sandberg and Murphy addressed Facebook’s policies on harmful content, with a particular focus on white nationalism.

Sandberg pointed out that Facebook implemented a ban on praise, support and representation of white nationalism and white separatism in March, and Murphy recommended that the social network take further steps on this front.

Murphy wrote that, although March’s policy change was “a noteworthy step forward,” she and the other auditors believe Facebook’s policy is still too narrow, as it only prohibits explicit praise, support or representation of white nationalism and white separatism, while not covering content that supports those ideologies without using the exact terminology.

Her report also touched upon Facebook’s use of its “Dangerous Individuals and Organizations” policy to ban people including Alex Jones, Milo Yiannopoulos, Laura Loomer and Louis Farrakhan from its platform, detailing signals that are evaluated by the social network in arriving at these decisions.

Facebook examines if individuals or organizations call for, advocate, support or carry out acts of violence based on race, ethnicity, religion or similar protected characteristics; if they are self-described or identified followers of hateful ideologies; if they use hate speech or slurs in their profiles; and if they have had Facebook pages or groups or Instagram accounts removed for violating the social network’s hate speech policies.