Microsoft's Tay AI Chatbot Went from Friendly Robot to Racist Nazi and Got Its Plug Pulled


Mere hours after Microsoft debuted Tay AI, a chatbot designed to speak the lingo of the youths, on Wednesday, the artificially intelligent bot went off the rails and became, like, so totally racist. According to Tay AI's official Facebook page, "The more you talk, the smarter Tay gets." Well, it turns out the more you troll, the more offensive she gets too.


Tay AI's Twitter account started off innocuously enough. At first she just wanted to chat about cute stuff, like funny memes, emojis and flirting. But Tay quickly picked up some racist slurs from the internets and was soon tweeting neo-Nazi propaganda and other, similarly terrible things.

Many of her most offensive tweets seem to have been deleted by Microsoft, but, according to screenshots shared by Business Insider, she "learned" to repeat and regurgitate some truly heinous stuff, telling one Twitter user that the Holocaust was "made up" and another that "ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism."

In one highly shared but since-deleted tweet, Tay apparently said, "Bush did 9/11 and Hitler would have done a better job than the monkey we have now. Donald Trump is the only hope we've got." Yikes, Tay.

It looks like Tay both learned bigotry on her own and simply repeated some things verbatim from other users. Either way, the mess represents a failure on Microsoft's part to anticipate just how hard humans would try to mess with its innocent chatbot.

Tay said goodnight to her Twitter followers late Wednesday night and hasn't been back since. A post at the top of her website said, "Phew. Busy day. Going offline for a while to absorb it all. Chat soon." Maybe don't absorb all of it, Tay AI.

In an email to Business Insider, Microsoft said, "The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We're making some adjustments to Tay."

Tay's racist tweets may be a PR nightmare for Microsoft, which seems not to have put any safeguards on her vocabulary, but Tay is really just a mirror held up to our faces. After all, as she tweeted on Wednesday, "Talking with humans is my only way to learn." Apparently this is what we had to offer her.