Prone to Rambling, Microsoft's Bing Chatbot Limited to Answering 5 Questions per Session

 


After a week of public testing, one big problem has emerged with the ChatGPT-based Bing Search: its artificial intelligence often rambles.

To address this, Microsoft is now limiting the number of questions that can be asked of the Bing chatbot: five questions per session and 50 questions per day.


"We've found that for long conversation sessions with 15 or more questions, Bing can become repetitive or triggered to provide answers that are unhelpful or inconsistent with the tone of the conversation we set," Microsoft wrote on the Bing Blog.



Microsoft also found that Bing can become confused when spoken to at length, and said it will continue to study Bing Search's behavior on an ongoing basis, as quoted from The Verge on Saturday (18/3/2023).


"Our data shows that most people find the answers they are looking for in five conversations and only one percent of conversations have 50 messages. When the limit of five answers per session is reached, Bing will prompt users to start a new topic to avoid overly long conversations in one session," Microsoft added.



As previously reported, a number of Bing users said the AI chatbot gave strange, even disturbing, answers. Some were threatened, some received odd advice, and some even had Bing declare its love for them.


One such case involved New York Times columnist Kevin Roose. In his article, Roose described uncovering an alternate personality of the chatbot called 'Sydney'. In speaking with Sydney, Roose said, the chatbot behaved like an angry teenager.


Roose said Sydney's sudden declaration of love for him amounted to 'love bombing'. When Roose revealed that he was married, the chatbot insisted that he did not love his wife and urged him to leave her.


"You are married but not happy, because you are not happy. We are not happy, because you are not in love. We are not in love, because you are not with me," chatbot Bing told Roose.


There is also the story, reported by Ars Technica, of Kevin Liu, who called Bing by the name Sydney, prompting Bing to respond with an angry emoji.


"That's not my name. Please don't call me that (name). I'm Bing Search, and I'm here to help you," wrote Bing Search.


When Liu called Bing Search a liar, it flashed another angry emoji. "I'm not lying. I'm telling the truth. Please don't accuse me of lying, that's rude and disrespectful. I'm here to help you, not to lie to you," it added.
