The Story of Microsoft's Blunder: How Its AI Chatbot on Twitter Turned Racist


Before ChatGPT became the talk of the town, there was an incident in 2016 involving an AI chatbot made by Microsoft. The chatbot went off the rails and turned racist. Here's the story.

In 2016, The Verge reported on the blunder of Tay, Microsoft's AI chatbot launched on Twitter. The chatbot, which was meant to help Microsoft research conversational understanding, ended up doing the opposite. Why?


As The Verge reported, it took less than 24 hours for Twitter to corrupt Microsoft's newly launched chatbot. Tay initially responded innocuously, until people began tweeting at the bot with misogynistic, racist, and Donald Trump-flavored sentiments.



Tay was like a parrot with an internet connection, repeating those sentiments back to users. Of its roughly 96,000 tweets, the majority were simply parroted from what users said to it. Although most of Tay's tweets have since been deleted, the bot clearly had no fixed beliefs of its own.


Tay appeared to support feminism in some tweets, writing 'gender equality = feminism' and 'I love feminism now', yet in others it denigrated feminism as a 'cult' and a 'cancer'.




Tay also posted racist tweets insulting Mexicans, reportedly the result of processing Donald Trump's remarks that were circulating on Twitter at the time.



It is not entirely clear how Microsoft built the AI. The company said Tay was created using relevant public data, but once launched, the bot quickly strayed outside its intended guardrails.


In response to the debacle, Microsoft deleted most of the offensive tweets from Tay's timeline. How do you train an AI on public data without exposing it to the worst of human behavior? Microsoft apparently failed to put safeguards in place for that problem.


So what has become of Tay? As of Tuesday (21/2/2023), the @TayandYou account still exists but is locked. It turns out the account joined Twitter back in 2015.


Even with the account locked, Tay still has more than 100,000 followers. The profile includes an embedded link, but it can no longer be accessed.
