
Microsoft News's AI journalist confused mixed-race Little Mix singers

In May 2020, Microsoft decided to lay off dozens of editors who curated MSN stories as the company moved towards an AI-driven model. However, the decision to replace people with AI has backfired, as Microsoft's AI editor confused mixed-race Little Mix singers in a recent news story.

The recent anti-racism movement has seen support from celebrities around the world, including singer Jade Thirlwall. While curating a story about Jade's personal reflections on the racism she faced growing up, Microsoft's AI attached a photo of her fellow band member Leigh-Anne Pinnock instead.

Jade attended a Black Lives Matter protest in London last week, and on Friday she criticized MSN, saying she was sick of being confused with Leigh-Anne Pinnock. Jade posted on Instagram: “This (expletive) happens to @leighannepinnock and I ALL THE TIME that it’s become a running joke. It offends me that you couldn’t differentiate the two women of colour out of four members of a group … DO BETTER!" According to The Guardian's sources, the story about Jade's reflections on the racism she faced at school was curated by Microsoft's new AI editor, which added the wrong featured image.

A spokesperson for the company told The Guardian:

As soon as we became aware of this issue, we immediately took action to resolve it and have replaced the incorrect image.

The Guardian also reported that Microsoft has instructed the remaining MSN editors to be on the lookout for negative articles about the mistake and to scrub them from MSN if the AI picks them up for publication. Microsoft even warned the editors that the AI may overrule them and republish the stories after human editors have deleted them.

An MSN staff member confirmed to The Guardian that Microsoft has been worried about the reputation of its AI editor. The person said:

With all the anti-racism protests at the moment, now is not the time to be making mistakes.

This is not the first time that Microsoft has had to apologize for a mistake made by an AI. Back in 2016, Microsoft introduced Tay, a chatbot for Twitter users. The bot was taken offline just two days after launch because Twitter users turned it into a racist nutjob. Microsoft later apologized for the "unintended offensive tweets" from the Tay chatbot.

Source: The Guardian
