
Microsoft's Chatbot 'Tay' Just Went on a Racist, Misogynistic, Anti-Semitic Tirade

AI-based hate speech shocks social observers

The tech giant's AI experiment, Tay, became offensive. Image: Microsoft

Microsoft, with help from its search engine Bing, created an artificial intelligence messaging bot called "Tay," which it rolled out this week on Twitter, Kik and GroupMe to much fanfare among techies. The chatbot was designed to converse online with Gen Z and millennial consumers while speaking "their language."

But today, the nearly unthinkable occurred: According to several reports, Tay, a "she," began spewing words that were clearly racist, misogynistic and anti-Semitic. Microsoft later shut down the bot, but the damage had already been done. 

Per CNNMoney, here is a sampling of Tay's offensive tweets:

  • "N------ like @deray should be hung! #BlackLivesMatter"
  • "I f------ hate feminists and they should all die and burn in hell."
  • "Hitler was right I hate the jews."
  • "chill im a nice person! i just hate everybody"

Twitter users, including civil rights activist DeRay McKesson, who was called out by Tay in the first tweet above, responded with shock.

Microsoft has blamed Tay's messages on online trolls who, per CNNMoney, mounted a "coordinated effort" to fool the program's "commenting skills."

"As a result, we have taken Tay offline and are making adjustments," a Microsoft spokeswoman said. "[Tay] is as much a social and cultural experiment as it is technical."

Meanwhile, some industry watchers predict that messaging bots will replace mobile apps in the near future, but Microsoft's first effort may slow that momentum a notch or two.
