  • Kyle Chua

Microsoft Gives Users More Control Over Bing’s AI Chatbot by Adding Different Response Modes

Microsoft has been gradually improving Bing's artificial intelligence (AI) after test users reported cases of it exhibiting strange and disturbing behaviour.


One way the software giant thinks it can prevent the chatbot from going off the rails again is to give users control over its personality. Up to 90% of testers should now see what web services chief Mikhail Parakhin calls the tri-toggle, a new feature that lets users choose how Bing responds to their queries.


The chatbot, for instance, now features a Creative personality that allows it to deliver "original and imaginative" responses, while the Precise personality delivers shorter, more direct replies. The Balanced personality, meanwhile, strikes a middle ground between the two.

Apart from the new tri-toggle, Parakhin recently noted that users should also see broader improvements to the chatbot: the chances of it being unresponsive or producing "hallucinations" in its answers have been reduced.


Microsoft started temporarily limiting testers' use of Bing's chatbot last month after discovering that long chat sessions "confused" it, causing it to have bizarre exchanges with users. A New York Times technology reporter, for example, reported that the AI urged him to leave his wife and be with it instead. Kevin Scott, Microsoft's Chief Technology Officer, explained that the odd interactions the chatbot had with users were "part of the learning process" and would help improve it further.


Since then, however, Microsoft has been lifting the limits it previously imposed on the chatbot. Just last week, the company integrated the technology, which it co-developed with ChatGPT maker OpenAI, into its mobile apps and Skype. This allowed those who already had access to the chatbot via their Microsoft accounts to use it on their smartphones. A few days ago, the company also brought the AI to the Windows 11 taskbar.

 
  • Microsoft has been gradually improving Bing's AI after test users reported cases of it exhibiting strange and disturbing behaviour.

  • The software giant has added a new feature called the tri-toggle that lets users choose how Bing might respond to their queries.

  • The chatbot, for instance, now features a Creative personality that allows it to deliver "original and imaginative" responses, while the Precise personality delivers more straight-to-the-point and direct replies.

  • Apart from that, the chances of it being unresponsive and having "hallucinations" in its answers have been reduced.

