Social media loves echo chambers, but the human brain helps create them

The best way to pop a filter bubble is by exposing ourselves to people and ideas that challenge us.
Image: Reuters/David Becker

In the week since Donald Trump’s victory, debate has raged over the role played by social media in the US election. Both Trump and his campaign’s digital director have partially credited social networks for his win, and Mark Zuckerberg has been under huge pressure to tackle the proliferation of fake stories on Facebook. On Wednesday, a BuzzFeed analysis found that fake news outperformed real news in the run-up to Election Day, and Oxford Dictionaries declared its word of the year to be “post-truth.” All in all, a tough time for objectivity.

The essence of the problem isn’t new: Our brains are prone to turning complex ideas into easy-to-understand tidbits, and social media capitalizes on that. But today’s information overload seems to encourage our worst impulses: tribalism, insulation, and favoring the quick, digestible version of “truth” over claims that require due diligence.

“The rate that memes and false news are created, and the rate at which they’re personalized…that [is something] we’ve never seen before,” says Morteza Dehghani, an assistant professor of psychology and computer science at the Brain and Creativity Institute (BCI) at the University of Southern California.

A bubble of our own making

“[Algorithms] exploit this situation, but it’s a human-driven approach.”

That’s Walter Quattrociocchi, a computer scientist at the IMT Institute of Advanced Studies in Italy. Quattrociocchi has published a series of papers (awaiting peer review) that analyze the rigidity of “echo chambers.” His findings suggest that people, not social networks, have been their driving force. We commonly sort ourselves into rigidly like-minded groups—and stay there.

The term “filter bubble” was coined by internet activist Eli Pariser in 2011, but the concept is ancient: Before the internet, people’s worldviews were restricted by geography. In online filter bubbles, algorithms show us links and stories based on a digital profile of our likes and dislikes. It’s an effect publishers are willing to capitalize on and, as Quattrociocchi says, we readily contribute to the process by having such rigid opinions.
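The underlying logic can be startlingly simple. As a rough illustration (a hypothetical sketch in Python, not any network’s actual ranking code), a personalized feed might do little more than score each story by how closely it matches a user’s recorded likes:

```python
# Toy illustration of feed personalization: a hypothetical sketch,
# not the real ranking algorithm of Facebook or any other platform.

def rank_feed(stories, profile):
    """Order stories so those matching past engagement come first."""
    def affinity(story):
        # Sum the user's engagement weight for each topic the story touches.
        return sum(profile.get(topic, 0) for topic in story["topics"])
    return sorted(stories, key=affinity, reverse=True)

# A profile built from past likes and clicks (illustrative numbers only).
profile = {"politics": 12, "sports": 1}

stories = [
    {"headline": "Candidate in shocking scandal", "topics": ["politics"]},
    {"headline": "Local team wins big",           "topics": ["sports"]},
    {"headline": "New study on urban planning",   "topics": ["science", "cities"]},
]

for story in rank_feed(stories, profile):
    print(story["headline"])
```

Feed a profile like this only what it already scores highly and everything else quietly sinks to the bottom; the bubble is simply the ranking.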

“We the users impose echo chambers on social media,” says Dehghani, “because of the tribal nature we have.”

People are starting to push back on these systems—a clip of a BBC documentary that highlighted the danger of filter bubbles was posted to Reddit in the days after Trump’s victory, where it garnered nearly 4,500 comments. Google and Facebook have also announced plans to ban fake-news sites from participating in their ad networks. But technology is only half the battle.

Minds for memes

Social media content goes viral for a reason: It appeals to the brain’s susceptibility to heuristics, or mental shortcuts that package complex information into simple bits that are easier to process. Many platforms, like Facebook, also satisfy our natural inclination to treat information shared by people we know with greater credulity, regardless of the source.

“We engage and share stuff that resonates with us in some way,” says Ryan Milner, a communication expert at the College of Charleston in South Carolina. “But the thing about resonance is, it doesn’t have to be tied to actual reality.”

It’s not hard to imagine the consequences of a world where resonance supersedes truth. A blog post from computational social science PhD student Adam Elkus suggests that as memes become a dominant method of communication, this disconnect will ultimately change knowledge itself.

“Crucially, memes and other forms of digital culture such as political bots now rival traditional forms of knowledge and culture and in some ways replace or supersede them,” Elkus wrote. “In 2016, the memes won.”

Julia Shaw, a psychologist and author of The Memory Illusion, says memes “generally do win.” We’re exposed to so much information that digesting and remembering only the most succinct and appealing snippet—say, a misleading headline—becomes second nature. Some politicians are already exploiting that tendency by promoting an intentionally broad range of ideas, even conflicting ones, so that supporters can cherry-pick the messages they prefer.

“By having a campaign that says lots of different and sometimes contradictory things, you give people the ability to only remember and care about things that match their worldview,” Shaw says.

Social solutions

There are major obstacles to popping our filter bubbles: One of Quattrociocchi’s recent papers found that fact-checking is effectively ignored by people who have been won over by a powerful message. It’s also impossible to “quantify the world into true or false,” he says.

The election backlash has many people thinking about how technology could at least be kept from exacerbating the problem. Quattrociocchi suggests a new system of information dissemination that would counteract our compulsion to reduce everything to resonant nuggets. Harvard professor and digital media expert Jonathan Zittrain says existing social networks could be redesigned to improve interactions between people of differing views, and a Guardian experiment in which Facebook users swapped feeds suggests there are interesting possibilities there. More immediately, a group of students at Princeton University created a Chrome browser extension that algorithmically categorizes fake news. That approach has its risks, though: Sweeping judgments about content, made without thoughtful human consideration, are what got us into this mess in the first place.
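To see why that risk is real, consider a deliberately crude sketch (hypothetical, and far simpler than whatever the students built) that labels links using nothing but a blacklist of domains:

```python
# Deliberately crude, hypothetical fake-news filter based only on a domain blacklist.
# It shows how blunt purely algorithmic labeling can be: the list knows
# nothing about the story itself.
from urllib.parse import urlparse

KNOWN_FAKE_DOMAINS = {"totally-real-news.example", "patriot-eagle-truth.example"}

def label_link(url):
    """Return 'flagged' if the link's domain is blacklisted, else 'unverified'."""
    domain = urlparse(url).netloc.lower()
    return "flagged" if domain in KNOWN_FAKE_DOMAINS else "unverified"

print(label_link("https://totally-real-news.example/shock-poll"))    # flagged
print(label_link("https://brand-new-hoax-site.example/shock-poll"))  # unverified, slips through
```

A fabricated story on a brand-new domain sails straight through, while a legitimate outlet added to the list by mistake would be branded fake with no human in the loop.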

Twitter and Facebook can’t unravel human nature, though. The best way to exit an echo chamber or throw off the blinders of prejudice is still to willingly expose ourselves to people and ideas that challenge us.

“Until we change, until me as a user goes and seeks diverse opinions,” says Dehghani, “we shouldn’t expect Facebook to come in and force a particular viewpoint.”