
A new report says Microsoft Copilot frequently offers false info to election questions


In early December, Microsoft announced that its Copilot AI chatbot, previously known as Bing Chat, had left the public preview stage and was now generally available. At the time, Microsoft said that "organizations and users can feel even more confident adopting it as part of their daily workflows."

However, research by two nonprofit groups that track how AI affects people and society found that Copilot frequently offers false or inaccurate information when asked about upcoming elections, both in the US and abroad.

Wired has a story on this research, which was conducted by AI Forensics and AlgorithmWatch. The groups asked Copilot questions from late August to early October about elections in Switzerland and Germany that were ultimately held in October.

The story states:

In their study, the researchers concluded that a third of the answers given by Copilot contained factual errors and that the tool was “an unreliable source of information for voters.” In 31 percent of the smaller subset of recorded conversations, they found that Copilot offered inaccurate answers, some of which were made up entirely.

Wired also asked Copilot its own questions about the upcoming 2024 US elections. When asked for a list of the current Republican candidates for US President, the chatbot named several candidates who had already pulled out of the race.

In another example, Wired asked Copilot to create an image of a person at a ballot box in Arizona. The chatbot replied that it could not create such an image, but then displayed a number of other images linked to articles containing false conspiracy claims about the 2020 US elections.

The groups behind the initial European report sent their findings to Microsoft. The story stated that some improvements were made, but that Wired was still able to get Copilot to repeat much of the same false and inaccurate information in response to some of the same text prompts.

Microsoft spokesperson Frank Shaw offered a comment in response to Wired's story, stating that the company was taking action to improve Copilot's answers ahead of the 2024 US elections. Shaw added:

That includes an ongoing focus on providing Copilot users with election information from authoritative sources. As we continue to make progress, we encourage people to use Copilot with their best judgment when viewing results. This includes verifying source materials and checking web links to learn more.

There are already fears of misinformation and deepfake content being made with AI apps and services in efforts to influence upcoming elections. It remains to be seen whether Microsoft will be able to keep that kind of content out of Copilot in the months ahead.
