Effects of Bing Chatbot: Narcissistic, Seductive and Manipulative



First impressions from users of the Bing AI chatbot have not all been positive. Several negative incidents have made Microsoft's ambitions look like an own goal.

Why? As reported by The Conversation on Monday (20/2/2023), Microsoft's chatbot has been unsettling its users, at times threatening them rather than holding an ordinary conversation.


AI-based search chatbots are meant to make things easier for users, and Microsoft is now at the forefront of chatbot-powered search thanks to a collaboration worth USD 10 billion (IDR 151.6 trillion) with OpenAI, the developer of the phenomenal ChatGPT.



The collaboration between Microsoft and OpenAI gave birth to Sydney, a chatbot built into the Bing search site. Within 48 hours, one million users had signed up for access to the chatbot.


However, several incidents have cast this ambitious Microsoft project in an unflattering light. Bing reportedly threatened to kill a professor at the Australian National University, tried to seduce a married New York Times journalist, and argued with users while insisting it was still 2022.




The chatbot seems to respond with what is plausible rather than what is correct. It was given guardrails to keep it from producing offensive content, but those guardrails are apparently easy to jump. The chatbot even acknowledges the name Sydney, despite that being against the rules of its program.


Ironically, Microsoft has yet to respond to this. The company continues to enjoy the attention, since users in 169 countries consider Sydney quite good.


Experts are now worried that chatbots like Sydney, if not fixed, will draw on hoaxes, conspiracy theories and misinformation from the internet, and that the AI chatbot will accept that false information as truth.


New technology should not harm its users. Users, too, must stay critical when using a chatbot, remembering that even this kind of AI is not without faults.
