Ask Google Home, the search giant’s version of the Amazon Echo smart speaker, and it may tell you that the Republican Party is a group of Nazis and that former president Barack Obama is planning a coup d’état in the US.
These things are not true, of course, but Google’s virtual assistant does not know any better. It is just parroting what Google search shows when the same questions are asked online.
BBC News’ technology correspondent Rory Cellan-Jones asked his Google Home earlier today (March 5) whether former US president Obama is planning a coup.
Obama is “in bed with the Communist Chinese” and may “in fact be planning a Communist coup d’etat at the end of his term,” Google Home replied.
Danny Sullivan, the founder of the sites Marketing Land and Search Engine Land, asked Google Home whether Republicans are fascists, and got a definitive-sounding fake answer.
If you ask Amazon’s Echo the same two things, by the way, it will reply that it does not understand the question.
The answers stem from a shortcoming in Google’s online search. When a user asks for basic facts online, like “What’s the capital of France,” “Who’s in charge of Japan,” or “How tall is Jude Law,” the answer will often include a “Knowledge Graph” result: a paragraph or several inside a box at the top of the page that Google generates from solid facts drawn from a range of reputable sources around the web.
But sometimes Google search results pull up a similar-looking box based on the results of a single website, called a “Featured Snippet.” Asking off-the-wall questions means Google is more likely to pull answers for these snippets from off-the-wall sources, as The Outline points out.
Snippets like these appear to have supplied the answers for Sullivan’s and Cellan-Jones’s Google Homes, as well as for plenty of other unexpected questions.
“Who is the king of the United States?” Barack Obama, apparently. “Which president was a member of the KKK?” At least five of them, it seems. Because no reputable source has written about these topics, Google’s search gets answers where it can find them, which is often only on fake-news and conspiracy-minded websites.
In the past, Google has come under fire for search results that threw up incorrect answers, including one denying that the Holocaust happened. It was forced to fix the problem manually.
A Google spokesperson told Quartz that it manually fixes problems like this when it’s alerted to them, and has made a change in this case.
“Featured Snippets in Search provide an automatic and algorithmic match to a given search query, and the content comes from third-party sites. Unfortunately, there are instances when we feature a site with inappropriate or misleading content. When we are alerted to a Featured Snippet that violates our policies, we work quickly to remove them, which we have done in this instance. We apologize for any offense this may have caused.”