Oh, Tay! Microsoft Corporation Apologizes for Racist, Sexist Chatbot Tweets

Microsoft Corporation (MSFT) has apologized for its AI chatbot Tay's tweets.

Tay was launched on Twitter Inc (TWTR) on Wednesday and had to be suspended after it started sending out racist and sexist tweets, which Microsoft has since deleted.

Microsoft said that it had tested Tay in smaller settings and felt the AI was ready to communicate with Twitter users. However, it said an attack on the chatbot exposed vulnerabilities in the AI.

The tech company said that it will continue testing Tay in private and that it will relaunch the AI once it feels the vulnerabilities have been fixed.
