
Tay (chatbot) - Wikipedia
Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. It subsequently caused controversy when the bot began to post inflammatory and offensive tweets through its Twitter account, causing Microsoft to shut down the service only 16 hours after its launch.[1]
Microsoft shuts down AI chatbot after it turned into a Nazi - CBS News
Mar 24, 2016 · Today, Microsoft had to shut Tay down because the bot started spewing a series of lewd and racist tweets. Tay was set up with a young, female persona that Microsoft's AI programmers...
Tay: Microsoft issues apology over racist chatbot fiasco
Mar 25, 2016 · Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a Holocaust-denying racist. But in doing so, it made it clear Tay's views were a result of nurture,...
Why Microsoft's 'Tay' AI bot went wrong - TechRepublic
Mar 24, 2016 · Less than a day after she joined Twitter, Microsoft's AI bot, Tay.ai, was taken down for becoming a sexist, racist monster. AI experts explain why it went terribly wrong.
Twitter taught Microsoft’s AI chatbot to be a racist asshole in …
Mar 24, 2016 · Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in “conversational understanding.” The more you chat with Tay, said Microsoft, the smarter it gets,...
Microsoft terminates its Tay AI chatbot after she turns into a …
Mar 24, 2016 · Microsoft has been forced to dunk Tay, its millennial-mimicking chatbot, into a vat of molten steel. The company has terminated her after the bot started tweeting abuse at people and went...
What Happened to Microsoft’s Tay AI Chatbot? - Daily Wireless
Mar 7, 2020 · Tay, which is an acronym for “Thinking About You”, is Microsoft Corporation’s “teen” artificial intelligence chatterbot that’s designed to learn and interact with people on its own. Originally, it was designed to mimic the language pattern of a 19-year-old American girl before it was released via Twitter on March 23, 2016.
Microsoft apologizes after AI teen Tay misbehaves - CNET
Mar 25, 2016 · On Friday, the Redmond, Washington, company took responsibility for a string of racist and sexist tweets sent by Tay, the artificial-intelligence chatbot that is the offspring of Microsoft's...
Chatbot Tay: Story of a PR Disaster - ubisend
Chatbot Tay was a disaster. Read its journey from human-loving friendly machine to super-evil racist robot, all in less than 16 hours. Discover what NOT to do.
Learning from Tay’s introduction - The Official Microsoft Blog
Mar 25, 2016 · Tay – a chatbot created for 18- to 24-year-olds in the U.S. for entertainment purposes – is our first attempt to answer this question. As we developed Tay, we planned and implemented a lot of filtering and conducted extensive user studies with diverse user groups.