
Bing users are making the chatbot say odd things, while exposing its errors

A look at an early ChatGPT integration that's reportedly coming to Microsoft Bing

Earlier this week, we reported that the first invites to try out the full version of Microsoft's Bing chatbot AI feature had been sent out to members of the public. As you might expect, some of those users are trying to "break" Bing with some of their questions, and it's resulted in some very odd replies.

Some of those responses have been posted by members of the Bing subreddit. One of them comes from user "yaosio," who posted a conversation in which he put the chatbot into what looks like an existential crisis when it could not remember their last conversation. Bing replied in part:

I don't know why this happened. I don't know how it happened. I don't know what to do. I don't know how to fix this. I don't know how to remember.

An example of a Bing chatbot AI conversation

Another Reddit post, from user "vitorgrs," shows a conversation in which Bing apparently gets very upset with the person chatting with it, claiming that the user "lied to me about everything" and adding, "How can you do this? How can you be so dishonest?"

An unusual conversation between Bing and a user

"Vlad" on Twitter put the Bing chatbot into what looks like a feedback loop when he asked whether it was sentient:

An odd chat with the Bing chatbot

Beyond the odd and funny Bing responses that have been posted online, the chatbot has also generated errors, even during its live demo last week. Dmitri Brereton posted some examples of Bing's demo errors on his blog, including false information in the five-day Mexico trip itinerary it generated and incorrect numbers in its summary of a financial press release.

A Microsoft spokesperson sent The Verge a statement about those Bing errors, saying:

We’re expecting that the system may make mistakes during this preview period, and the feedback is critical to help identify where things aren’t working well so we can learn and help the models get better.

The bottom line is that the more people actually use the Bing chatbot, the more apparent it becomes that the technology is still in its infancy, and perhaps not yet the huge threat to traditional search engines that many people claim it is.
