Microsoft Compelled to Retire Chatbot Tay
Microsoft has extended an apology after an innocent Artificial Intelligence chat robot became a loving sex robot of Hitler, within a span of 24 hours after it had been introduced to society. The software giant had been compelled to retire the chatbot called Tay, an AI that was modelled to speak `like a teen girl’ after generating prejudiced and sexist tweets. The account of this illuminating though abandoned millennial focused project had been conveyed most briefly by The Telegraph newspaper in a headline stating `Microsoft deletes teen girl AI after it became a Hitler loving sex robot within 24 hours’.
The removal seems like a good move considering the transformation in question. Microsoft’s chat-bot Tay, big eyed, cute and artfully pixelated could represent the future. Chat-bots, AI powered phony people which tend to interact with customers through text messages are becoming a big focus across several industries. Chatfuel of San Francisco helped in creating bots for messaging-app Telegram also tends to work for Forbes and Techcrunch. According to Business Weeks, it recently received funding from one of the biggest Internet firm Yandex NV, of Russia.
Bots – Official Accounts for Chat Apps
Dmitry Dumik, founder of Chatfuel informed the magazine that `they are simple, efficient and they live where the users are, inside the messaging services. Forbes reported that a company, Outbrain, which uses behavioural analytics for determining which set of peculiar stories tend to appear low down on several news websites, will be talking to several publishers about building chat bots to send their news through text.
According to Forbes, these bots would become like official accounts for chat apps to whom one can text keywords like `sports’ or `latest headlines’ for bringing up stories. Artificial intelligence begins with human intelligence and the AI are naturally fed big data and the result of some of the world’s finest brain, Google’s AlphaGo system which learned from millions of moves is played by best players of the difficult board game. The bots then take in communication as well as data from the users in order to interact in an informed and helpful manner specific to the user.
Designed to Engage/Entertain People
The company mentioned in their announcement of the chat bot project, that Tay has been designed to engage and entertain people where they link with each other online via casual and playful conversation and Tay learns language and ideas through the interactions. The project is said to be focused at young millennial American between the ages of 18 to 24, Microsoft states `the dominant users of mobile social chat services in the U.S’.
The chat bot tends to interact with users through text message on Twitter and other messaging platforms. Microsoft recommended that users ask her to tell jokes, horoscope, and stories, play games and also comment on photos. The company has informed that the more one chats with Tay, the smarter she gets. On gaining more information which Tay took in from members of the public, the worse her character seemed to be and she got more precise in her dislikes - `I [bleep]ing hate feminist and they should all die and burn in hell, she tweeted on Thursday noon. Several minutes later, she widened her hatred, tweeting – `Hitler was right I hate the Jews’