You’re probably familiar with the bite-sized info cards that Google sometimes throws at you when you input a popular query. Ask who the President of the US is, or how tall Everest is, and you get a small card with the answer, without needing to open any additional websites. They’re a quick and convenient way to get info, and they’re also the basis for the responses you get from the Google Assistant. Unfortunately, they’re also a great way to spread fake news.
Users this week discovered how the card-based system is spewing baseless accusations and conspiracy theories when asked certain questions. Google Home users were shocked to hear their mild-mannered digital assistant explain that Barack Obama was not only in bed with Chinese communists but was also planning a coup for the end of his term. The same answer came up in the form of an info card on certain mobile queries.
The problem stems from the fact that Google uses algorithms to find answers to queries, and the card system is designed to gather seemingly correct answers from all over the web without really discerning the validity of the info it’s relaying. On top of that, implementations like Google Home add another barrier: the user doesn’t actually see the source of the info, but just hears the vaguely legit-sounding name of an organization. The company explained:
Featured Snippets in Search provide an automatic and algorithmic match to a given search query, and the content comes from third-party sites. Unfortunately, there are instances when we feature a site with inappropriate or misleading content. When we are alerted to a Featured Snippet that violates our policies, we work quickly to remove them, which we have done in this instance. We apologize for any offense this may have caused.
Google says it has flagged this specific instance, though this isn’t the first time the company’s system has spread idiocy. A couple of years ago, the same card-based system insisted the Bible could answer all your queries about dinosaurs, and that they only lived a few thousand years ago – you know, inside Fred Flintstone’s house.
It’s clear that without some sort of basic editorial process, even an algorithmic one, these types of issues will continue to arise. And given how rapidly digital assistants are spreading, and how hard it is for users to verify answers they only hear spoken aloud, this problem is likely to get worse before it gets better.