Google it

Apparently Google isn't responsible for the digital news echo chamber

A new study shows that people seek out the news they want to read

A Google search for "reliable news, please." Google might not be the problem in your news searches. It could just be you. (Screenshot: Quartz)

A new study found that Google’s search algorithm does not disproportionately lead people to click on partisan and unreliable news. Rather, people largely click on the news they seek out—no matter where on the internet they find it.

Researchers from Stanford, Northeastern, and Rutgers universities tracked about 600 participants by having them install a web browser extension that followed their activity across the web during two periods: the run-ups to the 2018 and 2020 US elections. The researchers were able to compare participants' activity on Google's search engine with their behavior elsewhere on the internet, seeing not only what participants clicked on, but what Google showed them.


The study, published in the scientific journal Nature, pushes back against public narratives that the algorithms powering the internet’s most popular websites lead people to misinformation and entrap them in so-called filter bubbles and echo chambers.

“People just want to seek out what they want to seek out,” said Ronald Robertson, a postdoctoral fellow at the Stanford Internet Observatory and the lead author of the study. “And they use platforms to fulfill their information needs. As opposed to being a naive actor believing and doing what the platform shows them, they’re actively seeking things out.”


Out of the rabbit hole

As the internet has become more enmeshed in global politics, researchers, critics, and even regulators have grown concerned about the spread of misinformation—and about what happens to the news ecosystem when powerful algorithms determine what users do and don't see. Meanwhile, political conspiracy theories like QAnon were born online, and there has been no shortage of reports of internet users falling down online "rabbit holes" toward extremism.

But Robertson said that the relative roles of individual agency and algorithmic culpability haven't been well probed by independent researchers, and the studies that do exist largely focus on social media rather than search engines. (The authors cite a 2015 study finding that individual choice mattered more than algorithmic filtering in determining whether people were exposed to ideological perspectives they disagreed with on Facebook—but the authors of that study were researchers at Facebook.)

“Diverse content, diverse choices”

The findings of the new study, the authors wrote, don’t fully vindicate Google’s search engine for disseminating unreliable news. “In some cases, our participants were exposed to highly partisan and unreliable news on Google Search, and past work suggests that even a limited number of such exposures can have substantial negative impacts,” they wrote.


To classify news sources by partisanship, the researchers relied on an existing database built from a panel of partisan Twitter users and the web domains they shared. For a credibility metric, they relied in large part on the website NewsGuard, which rates news websites on their accuracy, quality, and transparency. Robertson called this a limitation of the study, since credibility can vary within a given publication.

Still, the study offers some assurance that we—as internet users and news consumers—have plenty of agency in discovering and reading news. “It’s quite plausible that Google could just say, we know what kind of news websites you frequent, and so we’re going to prioritize them in our search results; and people might be happier that way,” David Lazer, a Northeastern professor who co-authored the study, told a university publication. “But they don’t. They tend to show diverse content, diverse choices.”