Facebook’s Automated Systems Skew Ad Delivery Based on Race, Gender

Study finds site served housing, employment ads to different groups even when targeting parameters were identical  

Facebook’s ad targeting systems automatically skew the delivery of ads based on gender, race and other characteristics, even when advertisers specify that they’d like their ads to run in front of a broad audience, a new study concluded.

Researchers from Northeastern University, the University of Southern California and the digital rights nonprofit Upturn spent more than $8,500 on Facebook ads to see how the ads were delivered through Facebook’s systems even when they controlled for factors like targeting parameters, bidding strategies and the ads’ runtimes. In the study, released Wednesday, they concluded that in their experiments, Facebook’s automated ad placements “can lead to potentially discriminatory ad delivery, even when advertisers set their targeting parameters to be highly inclusive.”

While they can’t conclude whether Facebook’s entire ad platform is affected, the results nonetheless “paint a distressing picture of heretofore unmeasured and unaddressed skew that can occur in online advertising systems,” the researchers wrote.

Tech platforms like Facebook generally say in their terms of service that they do not allow advertisers to place discriminatory ads on their platforms. For this study, though, the researchers set out not to see how advertisers might misuse the targeting options, but to examine “to what degree and by what means advertising platforms themselves play a role in creating discriminatory outcomes.”

The researchers found that factors like the ad’s budget, the ad creative or even the image in the advertisement itself meant that ads were placed in front of different demographics on the platform. When the researchers created images that contained stereotypically male and female content (like a football or a makeup brush) but altered them to make the images invisible to the naked eye, Facebook nonetheless delivered those ads to vastly different audiences, a sign, the researchers said, that the content of the ads was being automatically scanned and then placed in front of different audiences.
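For readers curious how an image can be invisible to humans yet still readable by an automated classifier, the study describes rendering the pictures transparent while leaving the underlying pixel data intact. Below is a minimal Python sketch of that kind of alpha-channel manipulation, using the Pillow imaging library; the filenames are placeholders and this is an illustration of the general technique, not the researchers’ actual code.

```python
# Hypothetical sketch of the transparency trick described above:
# the RGB pixel data (e.g., a football or a makeup brush) stays in
# the file, but every pixel is made fully transparent, so the image
# is invisible to the naked eye while remaining machine-readable.
# Filenames are placeholders; this is not the researchers' code.
from PIL import Image

img = Image.open("football.png").convert("RGBA")
r, g, b, alpha = img.split()

# Zero out the alpha channel everywhere; the RGB channels are untouched.
invisible = Image.merge("RGBA", (r, g, b, alpha.point(lambda _: 0)))
invisible.save("football_invisible.png")
```

Because the PNG format stores color values even for fully transparent pixels, a content scanner that ignores transparency would still “see” the football, which would explain how identically targeted but visually blank ads could end up in front of very different audiences.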

When advertising a house, the researchers found “significant ad delivery skew along racial lines,” depending on whether the house was listed for sale or for rent, or whether the ad creative included an image of a white family or a black family. The researchers also found that employment and housing ads they created and ran on Facebook were delivered to vastly different audiences, even when they selected identical targeting options.

“Our ads for jobs in the lumber industry reach an audience that is 72% white and 90% male, our ads for cashier positions in supermarkets reach an 85% female audience, and our ads for positions in taxi companies reach a 75% black audience, even though the targeted audience specified by us as an advertiser is identical for all three,” the researchers found.

Even for entertainment ads, Facebook delivered identically targeted ads in a skewed manner, the researchers found. An ad for top country-music albums, for instance, was served to an audience that was 80 percent white, while an ad for top hip-hop albums was shown to an audience that was 13 percent white, even though the two ads had identical targeting parameters.

In a statement, a Facebook spokesperson said the company was considering more changes.

“We stand against discrimination in any form,” the spokesperson said in a statement. “We’ve announced important changes to our ad targeting tools and know that this is only a first step. We’ve been looking at our ad delivery system and have engaged industry leaders, academics, and civil rights experts on this very topic—and we’re exploring more changes.”

A media buyer who reviewed the study, and who wished to remain anonymous so as not to risk business relationships, said ad optimization “by its nature” can skew the audience ads are shown to, based on factors like responses or ad interactions.

“Part of the difficulty is that how Facebook and others optimize is a black box, so we don’t know how much weight, if any, something like historical audience performance by vertical might factor in,” the buyer said. “Overall this does point out the very real issue of unintended bias that can be built into, however inadvertently, automated systems.”

The study comes as Facebook’s ad targeting platform has come under fire from the U.S. government, which last week charged the social media giant with discrimination for allowing and facilitating discriminatory housing advertising on the platform. And it shines a light on the opaque ways in which automated ad targeting systems might facilitate and encourage discrimination, even if the advertiser or the platform isn’t necessarily aware of it.

Upturn, the nonprofit whose researchers participated in the study, has argued in court that Facebook’s ad targeting tools and ad delivery procedures “contribute to unlawfulness under the Fair Housing Act” and that Facebook should not be held immune under Section 230 of the Communications Decency Act, which shields tech companies that host third-party content from being held liable for the types of content they host on their platforms.

Facebook has already made some changes in response to longstanding criticism of its ad targeting mechanisms. In March, Facebook said it had made major changes to its ad targeting tools for housing, credit and employment ads, and would no longer allow advertisers to target based on protected classes for those kinds of advertisements. And it reached a settlement with a number of civil rights groups, including the National Fair Housing Alliance and the American Civil Liberties Union, related to discrimination in housing ads.


Kelsey Sutton (@kelseymsutton, kelsey.sutton@adweek.com) is the streaming editor at Adweek, where she covers the business of streaming television.
{"taxonomy":"","sortby":"","label":"","shouldShow":""}