It’s an old trope that humans don’t like change—especially when it comes to their opinions. Ancient Greek philosophers complained about the masses refusing to heed their advice. Scholars spearheading the scientific revolution in the 17th century bemoaned their predecessors’ stubbornness. And today, everybody complains about their brother-in-law who won’t admit his political opinions are deeply misinformed.
Some findings in experimental psychology bolster this view of humans as pigheaded. In countless studies, psychologists have recorded people’s opinions on subjects ranging from offal-eating to vaccination, exposed them to a message that critiqued their opinion, and then observed any changes in their opinion. Some messages proved to be persuasive and others barely had an effect—but most surprisingly, some strong arguments backfired, with people moving their opinion away from the view advocated, rather than toward it.
This is a scary prospect. If being exposed to divergent political views ends up reinforcing entrenched opinions rather than altering them, there will be no end to the current increase in political polarization.
For example, one study exposed participants to messages debunking the scientifically unfounded link between vaccines and autism. But for the participants who started out with the most anti-vaccination views, the message decreased their intent to vaccinate, rather than increasing it. In another study, participants who read a refutation of the claim that there were weapons of mass destruction in Iraq became more convinced that such weapons existed after reading these messages that proclaimed the opposite.
The backfire effect happens because reason has a massive “myside bias.” When people reflect on topics alone, they mostly find justifications and arguments that support their pre-existing views and initial hunches. This doesn’t mean people reject any argument that challenges their view outright—instead, when a person’s view is challenged by an argument that isn’t fully convincing, they look for counterarguments. In some cases, they find so many counterarguments that they end up more convinced of their initial position, not less.
Fortunately, in the vast majority of studied cases, there is no backfire effect. When people hear a good argument that challenges their views, they tend to move toward the argument. By and large, the myside bias affects how we produce arguments, not how we evaluate others’ arguments.
This is because reason is not geared toward the individual search for sound beliefs and good decisions. Instead, it has evolved to improve social interactions and make it easier for people to resolve disagreements by exchanging reasoning. This applies to both producing arguments to convince others and evaluating others’ arguments to decide whether we should be convinced of them or not. After all, it makes sense to have a myside bias when producing arguments—you are not going to convince anyone by giving them arguments against your point of view. However, when it comes to evaluating others’ arguments, reasoning must be demanding but also fair: Demanding so that poor arguments are weeded out, and fair so that we can change our minds when we hear good arguments, even if they challenge our opinions.
To counteract the myside bias, we should focus on having more active conversations instead of passively sparring with our opposition online. If you have the chance to talk through your arguments with someone else instead of simply reading an argument and pondering your response, you are more likely to change your mind. When people take the time to exchange arguments in the course of a discussion, they tend to adopt better-supported opinions. This has been observed in a great variety of domains, from medical diagnoses to political predictions. In the case of logical or mathematical problems, this happens even if the individual defending the correct answer faces a group that confidently and unanimously agrees on the wrong answer.
Believing that arguing will get us nowhere is not only unjustified, it might also be dangerous. The less we believe arguments work, the less we will try to engage people who disagree with us. In a self-fulfilling prophecy, we would then only talk with people who share our views—and that’s not going to change anyone’s mind at all.