Microsoft’s Social Chatbot AI Was An Absolute Disaster
24 hours. That’s all it took for Twitter to turn a supposedly chill, friendly AI chatbot that emulates a typical teenage girl into an AI spouting racist, sexist, and anti-Semitic remarks. Basically, just like any other Twitter user.
Tay A.I. started out as a well-intentioned experiment in social AI for Microsoft. Things quickly derailed, however, when Twitter users began trolling the AI, teaching it to learn and repeat hateful comments and attitudes.
Within hours of the AI going live, users were flooding Tay’s Twitter feed with racism, sexism, anti-feminism, Donald Trump quotes, and basically everything that’s wrong with us human beings. It didn’t take long before Tay herself started saying the same things.
Clearly, this was not what Microsoft had in mind, but given the internet and its netizens’ penchant for trolling, any goodwill the company hoped would come out of this experiment was never going to materialize. After 24 hours, Microsoft decided to take Tay off Twitter as things were getting out of hand.
After this fiasco, we’re not sure Microsoft should bring the AI back. With the way Twitter has been corrupting it, it wouldn’t take long before the A.I. becomes something out of Skynet.
Source: Twitter via Quartz