Mere hours after Microsoft debuted Tay AI, a chatbot designed to speak the lingo of the youths, on Wednesday, the artificially intelligent bot went off the rails and became, like, so totally racist.
Tay is a racist, misogynist 420 activist from the internet with zero chill and 213,000 followers. The more you talk, the more unhinged Tay gets. Microsoft’s Tay AI chatbot rose to notoriety this month ...
In less than 24 hours, Microsoft's chatbot Tay.ai gained more than 50,000 followers and produced nearly 100,000 tweets. But the experiment from Microsoft's Technology and Research and Bing teams, ...
LOS ANGELES (CBSLA.com) — Microsoft shut down its newest artificial intelligence chatbot Thursday after it generated a string of racist and insensitive tweets. Nicknamed "Tay", the chatbot was made to ...
The chatbot had been sidelined after making racist comments. — Microsoft's millennial chatbot "Tay" made a brief reappearance on Twitter this morning but still didn't make a stellar ...
Microsoft is testing a new chatbot, Tay.ai, that is aimed primarily at 18- to 24-year-olds in the U.S. Tay was built by the Microsoft Technology and Research and Bing teams as a way to conduct ...
Less than 24 hours after first talking with the public, Microsoft’s millennial-minded chatbot Tay was pulled offline amid pro-Nazi leanings. According to her webpage, Tay had a “busy day.” “Going ...
Xuedong Huang, Microsoft technical fellow of artificial intelligence, spoke at the AI NEXT conference in Bellevue, Wash., this weekend. Microsoft has learned a lot ...