The false vaccine debate shows we’re in a golden age of believing whatever we want

Ouch.
Image: Reuters/Brian Snyder

As Georgia becomes the latest US state to report a case of measles, many across the land are shaking their heads in disbelief. How can otherwise intelligent, well-intentioned parents choose not to have their children vaccinated? It’s tempting to conclude that the problem is simply a lack of information. Give them the facts and they will see the light. Not so fast.

The parents—all of us, in fact—are susceptible to a quirk of human nature variously called “selective exposure” or “confirmation bias.” Simply put, we gravitate to information that bolsters our convictions (about anything) and reject that which undermines them. This is not new. In the early 17th century, Francis Bacon observed that “The human understanding when it has once adopted an opinion… draws all things else to support and agree with it.”

Today, the temptation to engage in confirmation bias is even greater since information confirming your particular bias is only a click away. Put another way: if you’re in the business of cherry picking data, this is a golden age, for the cherries are more plentiful than ever, and far easier to pick.

Sometimes, in fact, they pick themselves. Search engines and online retailers like Amazon discern the types of products and information we supposedly like, then see to it that we’re exposed to little else. If you’ve purchased books with a liberal perspective, Amazon will steer you to similar titles and ensure that a conservative tome never graces your cart.

Facebook and other social media sites, meanwhile, filter our newsfeeds so that we are exposed mostly to like-minded views. Let’s say you’re liberal but have a lone conservative friend. His posts may not be displayed as prominently, since they represent an outlier and are therefore “discounted” by the algorithm. The net result, says Paul Resnick, a professor at the University of Michigan’s School of Information, is that social media “may intensify not only feelings of loneliness, but also ideological isolation.”
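To make that “discounting” concrete, here is a minimal sketch of how an engagement-driven feed ranker could quietly bury an outlier friend’s posts. This is not Facebook’s actual code; the `Post` fields, the leaning scores, and the `similarity_weight` are all illustrative assumptions.

```python
# Hypothetical sketch of the "discounting" effect described above.
# A feed ranker scores each post partly by how closely it matches the
# user's inferred ideological leaning; all names and weights are assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    engagement: float  # predicted likes/comments, 0..1
    leaning: float     # -1.0 (conservative) .. +1.0 (liberal), inferred

def rank_feed(posts: list[Post], user_leaning: float,
              similarity_weight: float = 0.6) -> list[Post]:
    """Order posts by engagement, discounted by ideological distance."""
    def score(post: Post) -> float:
        # similarity is 1.0 when views are identical, 0.0 at opposite poles
        similarity = 1.0 - abs(post.leaning - user_leaning) / 2.0
        return (1 - similarity_weight) * post.engagement + similarity_weight * similarity
    return sorted(posts, key=score, reverse=True)

feed = rank_feed(
    [Post("liberal_friend", engagement=0.5, leaning=0.8),
     Post("conservative_friend", engagement=0.9, leaning=-0.7)],
    user_leaning=0.8,
)
print([p.author for p in feed])
# ['liberal_friend', 'conservative_friend'] -- the like-minded friend
# ranks first despite generating less engagement
```

Even in this toy version, the conservative friend’s post loses out not because it is less engaging but simply because it sits farther from the user’s inferred views.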

But facts are facts, you say, and in the end truth prevails. Not necessarily. On the Internet, bad information never dies. It’s like one of those Marvel Comics villains who grows stronger every time the good guys assault him. Google “vaccines and…” and the first autocomplete suggestion is “vaccines and autism.” There is no evidence of such a link, of course; the single study claiming one has been widely discredited, and its author stripped of his medical license, but for those suspicious of vaccines, that matters little. For them, the connection has been made. Should they need more convincing, they can click on websites like Age of Autism, which maintains there is a connection between vaccines and autism.

Irrational? Perhaps, but this is where “motivated reasoning” enters the picture. Motivated reasoning is confirmation bias taken to the next level. We think of reasoning and feeling as occupying wholly separate realms, but they don’t. Rational thinking always contains a strong emotional component, studies show, even if we’re not aware of it. This is especially true when we feel fearful or threatened by new information. Like a cornered animal, we hunker down in our intellectual caves. Psychologist Jonathan Haidt, in his book The Righteous Mind, says our minds act less like scientists, boldly going where the facts lead, and more like press secretaries, stubbornly defending our core positions no matter what.

A case in point is climate change. In one study, researchers asked some 240 people in upstate New York to read a variety of simulated news stories about climate change, then asked them whether they would support policies to address it. The stories focused on eight fictitious farmers and how their livelihoods might be affected by rising temperatures. By manipulating the identity of the farmers—making some locals and others foreigners—the researchers were able to alter people’s resolve to support policy action. Knowledge about climate change, or about science in general, played virtually no role in their decisions.

Is there anything we can do about selective exposure, or are we destined to live in intellectual silos? One thing is certain: the approach we’ve tried so far isn’t working. For years, scientists relied on the “deficit model” of communication: people make poor decisions—like not having their children vaccinated—because they simply don’t have the facts. So efforts focused on getting the word out, bombarding people with boatloads of studies and data, confident that this would change minds. It doesn’t, and in fact it can backfire. The more vigorously the scientific community repudiates erroneous information, the more fiercely people cling to their beliefs. The truth does not always set us free and, in fact, may tighten our shackles.

The most effective way to correct a falsehood is not with the truth alone but by engaging people as equals, as intellectual partners, even if you find their views Paleolithic. Condescending to someone is a sure way to alienate them, and thus to ensure your message is never heard. Rather than leading with the facts, it is best to “lead with values so as to give the facts a fighting chance,” says science writer Chris Mooney, writing in Mother Jones.

Another important piece of the puzzle is what psychologists call social distance. The greater the social distance between the messenger and the recipient, the less likely the message is to stick. Thus, a call to vaccinate your child that comes from the CDC is less effective than one that comes from your local health department—or, better yet, a neighbor or friend.

As for those pesky algorithms, the problem is not that they are too clever but, rather, that they aren’t clever enough. Writing in the journal Daedalus, Paul Resnick argues for more creative code writing. How about an algorithm that “learns” when we are in a curious mood, based on our online behavior, and chooses those times to expose us to more “challenging information”? Or one that highlights stories that attract strange bedfellows, “those that are liked by two clusters of people who do not usually agree with each other.” The possibilities are endless.
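That second idea lends itself to a sketch. Below is a minimal, hypothetical implementation of the “strange bedfellows” heuristic: given which users liked which stories, plus a precomputed cluster label for each user, rank stories by how many normally opposed clusters they bridge. The data structures, cluster labels, and story names are assumptions for illustration, not any real platform’s API.

```python
# Hypothetical sketch of Resnick's "strange bedfellows" idea: surface
# stories liked by clusters of users who rarely agree with each other.
# The like data and cluster assignments here are illustrative assumptions.
from collections import defaultdict

def strange_bedfellows(likes: dict[str, set[str]],
                       cluster: dict[str, str]) -> list[str]:
    """Rank stories by cross-cluster appeal: first by how many distinct
    clusters liked the story, then by total likes as a tiebreaker."""
    clusters_per_story: dict[str, set[str]] = defaultdict(set)
    likes_per_story: dict[str, int] = defaultdict(int)
    for user, stories in likes.items():
        for story in stories:
            clusters_per_story[story].add(cluster[user])
            likes_per_story[story] += 1
    return sorted(
        likes_per_story,
        key=lambda s: (len(clusters_per_story[s]), likes_per_story[s]),
        reverse=True,
    )

likes = {
    "alice": {"measles-outbreak", "local-farm-report"},
    "bob":   {"local-farm-report"},
    "carol": {"measles-outbreak"},
}
cluster = {"alice": "left", "bob": "left", "carol": "right"}
print(strange_bedfellows(likes, cluster))
# ['measles-outbreak', 'local-farm-report'] -- the story liked across
# both clusters is promoted over the one liked by a single cluster
```

The design choice is the point Resnick is making: instead of rewarding agreement, the ranking explicitly rewards stories that cut across the usual ideological lines.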

“Everyone is entitled to his own opinion,” Daniel Patrick Moynihan famously said, “but not to his own facts.” If only it were so. Many people do feel entitled to their own facts, and that is proving not only frustrating, but dangerous as well.