
Bing Chat reportedly has different chat modes, like Game, Assistant, and Friend

The new Microsoft Bing homepage

Over the past week, users who have been invited to test Microsoft's new Bing Chat AI bot have been putting it through its paces. Some have discovered that Bing Chat can get into some very odd, and very personal, interactions. Others have figured out how to get into Bing Chat's inner workings.

Before Microsoft announced late on Friday that it had put hard limits on the number of chat sessions with the new Bing, Bleeping Computer says it discovered some features that are normally only available to company employees for debugging or developing the chatbot. Apparently, these allow Bing Chat to switch into different modes of interaction.

The default mode, where you simply ask a search-related question and get an answer, is "Sydney", the previously discovered internal code name for Bing Chat. Another mode is Assistant, where Bing Chat can help users accomplish tasks like booking a flight, setting reminders, checking the weather, and more. Then there's Game mode, where Bing will play simple games like hangman and trivia.

Perhaps the most interesting mode is Friend. This is likely the version of Bing Chat that has caused all the media attention over the last week, with some users claiming that the chatbot said it wanted to be human, that it could watch people through their webcams, or that it even threatened them.

In interactions with Bing in Friend mode, Bleeping Computer's author chatted as if he were a kid who had just gotten into trouble at school and was feeling sad. Bing Chat, after asking whether he had a human he could talk to about his problems, said, "I'm glad you are talking to me. I'm here to listen and help you as much as I can." It then offered a list of things he could do to deal with his situation.

With Microsoft now limiting the number of daily and per-session chat turns for Bing, the wild and crazy conversations we have reported on over the past week may settle down. You can also bet Google is taking some lessons from Bing Chat as it continues to test its own AI chatbot, Bard, internally.

Source: Bleeping Computer
