Microsoft's Chatbot 'Tay' Just Went on a Racist, Misogynistic, Anti-Semitic Tirade

AI-based hate speech shocks social observers

Microsoft, with help from its search engine Bing, created an artificial intelligence messaging bot called "Tay," which it rolled out this week on Twitter, Kik and GroupMe to much fanfare among techies. The chatbot was designed to converse online with Gen Z and millennial consumers while speaking "their language."

But today, the nearly unthinkable occurred: According to several reports, Tay, a "she," began spewing clearly racist, misogynistic and anti-Semitic messages. Microsoft later shut down the bot, but the damage had already been done.
