Twitter’s Ad Network Is Waging a War Against Anatomy

The Vagina Bible writer took to the platform after her ad was rejected

The company's most salacious brand safety scandal to date is, in many ways, the most infuriating.
Illustration: Dianna McDougall

Twitter has let media buyers promote iffy content, including state-sponsored propaganda, overt white supremacy and obvious scams, in the past, but clinical references to women’s bodies apparently won’t get through.

Per a recent tweet by New York Times contributor Jennifer Gunter, the platform refused to let her publisher run any ads with the word “vagina” to promote her new book, The Vagina Bible. According to Gunter, the promoted tweets could show the book’s cover—with “The Vagina Bible” in bold, black font—but couldn’t use the actual word in the ad.

Twitter’s decision underscores the brand safety concerns digital publishers, including Twitter itself, are wrestling with. The word “vagina” doesn’t appear in Twitter’s “prohibited content” policies for advertisers. What does show up is a ban on “vulgar, obscene or distasteful” content. Twitter’s ban on “adult sexual content,” meanwhile, is only slightly more specific but seems more focused on quashing ads showing naked women rather than those describing women’s bodies.

It’s not shocking that words like these get flagged. In 2017, an anonymous executive from a female-centric “mainstream content publishing site” told Digiday that brands had increasingly been asking her to tone down the language she uses on her site.

“In the past, it was explicit words like ‘fuck’ or words related to oral sex such as ‘blowjob’—which I think everyone understood,” she said. Now, she added, it’s words that “some people say in front of their children,” which, of course, includes “vagina.”

These less-than-perfect standards are policed by less-than-perfect tech. Most publishers use text-recognition software to monitor what passes muster in their ads and on their sites; a 2018 report from the visual-search firm GumGum found that less than a fifth of media buyers were even monitoring pictures and videos for brand safety. But the platforms deploying these tools, from Yahoo to Facebook to YouTube, are more focused on stomping out pictures of offending anatomy than on the words themselves.

The most likely reason a picture of Gunter’s book title passed muster while the word in question didn’t is that the algorithms Twitter uses to keep its ads in check aren’t doing the job they were designed to do.

“We did not take action on Promoted Tweets from this account because of references to sexual organs as those are permitted within our rules,” a Twitter spokesperson said. “The rejection of some of the promoted content from the account was due to a combination of human error and violations, including the use of profanity and adult products. We have reinstated the tweets we took down and have informed the account owner of the reasons why we blocked the content that violated our ad policies.”
