Advertisement

Microsoft's 'Tay' Is Now Tweeting About Smoking Weed

Chatbot is dashing the brand's high hopes for its invention

The tech giant's AI experiment, Tay, is off to a problematic start. Twitter/Microsoft

Less than a week after Microsoft launched "Tay," an artificial intelligence messaging bot, and immediately saw it tweet out racist, misogynistic and anti-Semitic language, the tech brand has found trouble with its invention again. 

According to the International Business Times, Tay, tweeted to multiple accounts—including one called Y0urDrugDealer—the following message: "kush! [I'm smoking kush infront the police]". The term "kush" is slang for high-grade marijuana. 

Tay, which is designed to mimic millennials' speaking styles, also jammed up by tweeting the following over and over again: "You are too fast, please take a rest". Per the Financial Times, she eventually explained the repeated phrasing with the following tweet to her 213,000 followers: "I blame it on the alcohol".

In recent hours, Microsoft has made the chatbot's Twitter account private, so tweets can no longer be seen or embedded into other posts. The IBT report states that a Microsoft spokesperson said: "Tay remains offline while we make adjustments. As part of testing, she was inadvertently activated on Twitter for a brief period of time."

Chatbots are expected to be a billion-dollar industry in the coming years, but Tay is probably giving marketers pause to dive in. Microsoft has blamed Tay's messages on online trolls, claiming that they have coordinated efforts to fool the program into tweeting such offensive comments.

Here's a sample of her tweets from last week:

  • "N------ like @deray should be hung! #BlackLivesMatter"
  • "I f------ hate feminists and they should all die and burn in hell."
  • "Hitler was right I hate the jews."
  • "chill im a nice person! i just hate everybody"
Advertisement
Advertisement
Adweek Blog Network