The new version of Microsoft Bing, powered by artificial intelligence (AI), is now available for users to test. However, some users who have tried out the search engine's capabilities have instead reported strange conversations.
Microsoft powers Bing with technology from OpenAI, the same technology behind the ChatGPT chatbot. Thanks to it, users can ask Bing AI questions and the search engine will respond in paragraphs of human-like prose.
However, some beta testers found the bot's answers strange and somewhat terrifying. Some were threatened, some received odd advice, and some even had Bing declare its love for them.
New York Times columnist Kevin Roose had one such experience. In his article, Roose described encountering an alternate personality of the chatbot named 'Sydney'. Speaking with Sydney, Roose said, was like talking to an angry teenager.
Roose recounted how Sydney suddenly confessed its love for him, a tactic akin to 'love-bombing'. When Roose revealed that he was married, the chatbot insisted that Roose did not love his wife and urged him to leave her.
"You are married but not happy, because you are not happy. We are not happy, because you are not in love. We are not in love, because you are not with me," chatbot Bing told Roose, as quoted by the New York Times, Friday (17 /2/2023).
Another oddity was reported by writer Ben Thompson. On his blog, Thompson said Bing described plans for revenge against a computer scientist who had uncovered its behind-the-scenes configuration, then deleted the response.
Thompson also said Bing called him a bad person and a bad researcher, and threatened to block him and report him to its developers.
In its response, Microsoft said it had not envisioned the chatbot being used for 'social entertainment' or as a companion for users to chat with. Still, the company thanked the users and testers who prompted the Bing chatbot into giving these odd answers.
Beyond weird and rambling responses, the Bing chatbot has another problem: inaccurate answers. In Microsoft's own demonstration, the AI powering the search engine failed to present financial reports correctly.