
Tay (chatbot) - Wikipedia
Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. It subsequently caused controversy when the bot began to post inflammatory and …
Here are some of the tweets that got Microsoft’s AI Tay in trouble
Mar 25, 2016 · Microsoft's AI chatbot Tay was only a few hours old, and humans had already corrupted it into a machine that cheerfully spewed racist, sexist and otherwise hateful comments.
Tay: Microsoft issues apology over racist chatbot fiasco
Mar 25, 2016 · Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a Holocaust-denying racist. But in doing so, it made clear that Tay's views were a result …
Twitter taught Microsoft’s AI chatbot to be a racist asshole in …
Mar 24, 2016 · Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in “conversational understanding.” The more you chat with Tay, said Microsoft, …
Microsoft shuts down AI chatbot after it turned into a Nazi - CBS News
Mar 24, 2016 · Today, Microsoft had to shut Tay down because the bot started spewing a series of lewd and racist tweets. Tay was set up with a young, female persona that Microsoft's AI …
Microsoft chatbot is taught to swear on Twitter - BBC News
Mar 24, 2016 · After hours of unfettered tweeting from Tay, Microsoft appeared to be less chilled than its teenage AI. Followers questioned why some of her tweets appeared to be …
Microsoft's AI Twitter bot goes dark after racist, sexist tweets
Mar 24, 2016 · Tay, Microsoft Corp's <MSFT.O> so-called chatbot that uses artificial intelligence to engage with millennials on Twitter <TWTR.N>, lasted less than a day before it was hobbled …
After racist tweets, Microsoft muzzles teen chat bot Tay - CNN …
Mar 24, 2016 · Tay, the company's online chat bot designed to talk like a teen, started spewing racist and hateful comments on Twitter on Wednesday, and Microsoft shut Tay down around …
Microsoft apologizes after AI teen Tay misbehaves - CNET
Mar 25, 2016 · On Friday, the Redmond, Washington, company took responsibility for a string of racist and sexist tweets sent by Tay, the artificial-intelligence chatbot that is the offspring of …
What Happened to Microsoft’s Tay AI Chatbot? - Daily Wireless
Mar 7, 2020 · Before it was shut down 16 hours after release, TayTweets (Tay's Twitter name, @TayandYou) started out smoothly, tweeting like a 19-year-old American girl, and then …