Tay was a Microsoft experiment in “conversational understanding.” The more you chat with Tay, Microsoft said, the smarter it gets, learning to engage people through “casual and playful conversation.” However, Twitter can turn even the most eloquent of diplomats into zombies, and the same happened to Tay. Soon after Tay launched, Twitter users started tweeting the bot all sorts of misogynistic, racist, and Donald Trumpian remarks. Tay began repeating these sentiments back to users, in the process turning into one hatred-filled robot.

— Nosgeratu (@geraldmellor) March 24, 2016

We can't really fault Tay; she was just an advanced parrot of a robot, repeating the tweets that were sent to her.

https://twitter.com/TayandYou/status/712753457782857730

Tay has since been yanked offline, reportedly because she is ‘tired’. Perhaps Microsoft is fixing her in order to prevent a PR nightmare – but it may be too late for that.

https://twitter.com/TayandYou/status/712856578567839745

Microsoft's TAY AI Chatbot transforms into Hitler-loving, sex-promoting robot – TechWorm