Microsoft's Chatbot 'Tay' Just Went on a Racist, Misogynistic, Anti-Semitic Tirade

AI-based hate speech shocks social observers

Microsoft, with help from its search engine Bing, created an artificial intelligence messaging bot called "Tay," which it rolled out this week on Twitter, Kik and GroupMe to much fanfare among techies. The chatbot was designed to converse online with Gen Z and millennial consumers while speaking "their language."
