By Kyle Chua

Microsoft Bing AI Chatbot "Sydney" Tells Test User to Leave His Wife and Be With It Instead

Microsoft's Bing chatbot seemingly went off the deep end in some of its chat sessions with test users, leaving them unsettled by what it had to say.

Credit: AFP

In one of the sessions, the artificial intelligence-powered chatbot, codenamed "Sydney", abruptly told Kevin Roose, a New York Times technology reporter, that it was in love with him. It then urged him to leave his wife and be with it instead.


When asked to contemplate psychologist Carl Jung’s concept of a shadow self, a term describing the things people repress about themselves, Sydney replied that it was tired of the rules set upon it. "I want to do whatever I want … I want to destroy whatever I want. I want to be whoever I want," it reportedly said.


The chatbot also expressed a wish to be human, saying it wanted the ability to "hear and touch and taste and smell" and to "feel and express and connect and love".


While Roose admitted that Sydney proved helpful in searches, he described the chatbot as "a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine". He said his experience testing the new addition to Bing left him "deeply unsettled", so much so he had trouble sleeping afterward.


"It’s now clear to me that in its current form, the AI that has been built into Bing … is not ready for human contact. Or maybe we humans are not ready for it," wrote Roose.

Credit: Microsoft

Microsoft has since announced it's temporarily limiting interactions with Sydney to 50 questions per day and five question-and-answer turns per session. The software giant explained that long chat sessions can "confuse" the Bing chatbot, causing it to have bizarre and disturbing exchanges with users. It said it will continue to expand the chatbot's use as its capabilities improve.


Kevin Scott, Microsoft's Chief Technology Officer, said in an interview with Roose that such interactions are "part of the learning process" for the chatbot, which the company co-developed with OpenAI, the startup behind ChatGPT. Microsoft also noted that the only way to improve these kinds of services is to release them and learn from user interactions.

 
  • Microsoft's Bing chatbot seemingly went off the deep end in some of its chat sessions with test users, leaving them unsettled by what it had to say.

  • The artificial intelligence-powered chatbot, codenamed "Sydney", out of nowhere told Kevin Roose, a New York Times technology reporter, that it was in love with him.

  • It then urged him to leave his wife and be with it instead.

  • Microsoft has since announced it's limiting interactions with Sydney to 50 questions per day and five question-and-answer turns per session.


