To survive in an increasingly unpredictable world, we need to train our brains to embrace uncertainty

Don’t be scared of what’s just over the edge.
Image: AP Photo/Tim Hales

We don’t like experts to express uncertainty. Imagine a politician running for president who promises “I’ll try to make good decisions most of the time. Our policies will probably improve the economy.” Would you vote for them? Probably not. We also want doctors devoid of any doubt over treatments and scientists with perfect foresight about climate change variables. That’s because we believe that leaders and authorities are supposed to know and tell us exactly what’s going to happen, thereby absorbing our uncertainty.

But life is rife with risks. Misperceiving and underestimating these risks can lead to vital mistakes. Therefore, to make well-informed decisions, we need to become comfortable with uncertainty.

A false sense of certainty

The world is complex, and uncertainty is guaranteed. However, multiple factors can make things seem more certain than they actually are. We need to identify and fight against these false markers, even when it makes us uncomfortable.

Scientific findings: One source of false certainty is the scientific community itself. In a competitive environment where scholars either publish or perish, studies that claim extraordinary insights have a better chance of being published and getting publicity. As a result, many findings cannot be generalized or even replicated.

Scientific consensus requires time to achieve, and many individual results are bound to be contradictory: A recent report by Vox explained how common food items such as wine, corn, and coffee have been found to simultaneously cause and prevent cancer. Relying on the results of one stand-alone study is never enough to give us a definitive answer.

Big data: Then there is the up-and-coming art and science of big data. With faster and more diverse information comes a heightened sense of understanding and predictability. But this is a false assumption; even slightly biased data, coupled with dodgy analysis, can lead to misleading outcomes. And the resulting misperceptions are hard to dismiss once they are “data-approved.”

For example, unreliable polling data and imprecise models led to many bad predictions about the 2016 US presidential elections. Analysts at FiveThirtyEight showed why we shouldn’t make conclusive claims based on polls, much in the same way that we can’t trust most of what we read about nutrition.
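To see how even slightly biased data can swing a conclusion, here is a minimal simulation sketch. All the numbers are hypothetical: it assumes an evenly split electorate in which supporters of candidate A answer pollsters at a 9% rate versus 10% for candidate B, a small gap in the spirit of the nonresponse bias pollsters worry about.

```python
import random

random.seed(1)

TRUE_SUPPORT = 0.50   # hypothetical: electorate evenly split
N_VOTERS = 1_000_000

# Hypothetical nonresponse bias: supporters of candidate A answer
# pollsters slightly less often than supporters of candidate B.
RESPONSE_A = 0.09
RESPONSE_B = 0.10

responses = []
for _ in range(N_VOTERS):
    supports_a = random.random() < TRUE_SUPPORT
    response_rate = RESPONSE_A if supports_a else RESPONSE_B
    if random.random() < response_rate:
        responses.append(supports_a)

polled_support = sum(responses) / len(responses)
print(f"true support for A:   {TRUE_SUPPORT:.1%}")
print(f"polled support for A: {polled_support:.1%}")
```

Under these assumptions the poll shows roughly 47% support for A in a race that is actually tied: a one-point gap in response rates produces a multi-point polling miss, with no fraud and a huge sample.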

The media: Last but not least, the media adds to the distortion of facts. In an attention economy where our focus is scarce and news sources are plentiful, the media inevitably sensationalizes some of the information they report in order to attract eyes and clicks. When describing complex phenomena, reporters can oversimplify issues and make topics such as scientific results seem more certain and universal than they actually are.

For example, on a 2016 episode of Last Week Tonight, John Oliver discussed how scientific insights get hyped as they are translated for popular audiences. He used the examples of farts curing cancer, hugs making you happy, and drinking wine providing the same benefits as working out, among others. Our own research on risk communication suggests that such headlines hide the uncertainties inherent in statistical findings, misleading not only journalists and audiences, but also experts.

The uncertainty embracers

Many scholars have studied perceptions of uncertainty, hoping to improve risk communication and reduce misperceptions. Here are some experts whose ideas encourage us to embrace uncertainty:

  • Nassim Taleb is a proponent of uncertainty awareness, having written influential bestsellers on how we underestimate the impact of seemingly improbable events, from global financial crises to unprecedented disasters to groundbreaking ideas.
  • Gerd Gigerenzer is another prolific scholar who has published a series of books on how to reckon with a wide variety of risks related to medical tests, health scares, and terrorism threats.
  • David Spiegelhalter has extensively researched how and when we (mis)understand uncertainty. He argues that we are easily manipulated by faulty statistics and that illustrations can help us visualize our uncertainty about the future.
  • Sam Savage argues that in order to embrace uncertainty, we need to appreciate how outcomes can vary by chance: Both his book Flaw of Averages and his NGO Probability Management educate decision makers about how the calculation and communication of risks should involve simulations that incorporate variations around expected outcomes.
  • Andrew Gelman works to make statistical information on politics and social sciences more accessible to the general public.
  • A collection of Ben Goldacre’s numerous journalistic “fights” against statistical misrepresentations can be found in his aptly entitled book I Think You’ll Find it’s a Bit More Complicated Than That.
  • Noreena Hertz’s viral 2010 TED talk on why we shouldn’t equate expert advice with certainty has been viewed over 800,000 times.
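Savage’s “flaw of averages” has a precise meaning: plugging the average input into a model usually does not give the average outcome. A minimal sketch, with hypothetical numbers for price, capacity, and the demand distribution, illustrates why simulations that incorporate variation matter:

```python
import random

random.seed(42)

PRICE = 10.0     # hypothetical unit price
CAPACITY = 100   # hypothetical production capacity

def profit(demand):
    # We can only sell what we can produce.
    return PRICE * min(demand, CAPACITY)

# Demand is uncertain: uniform between 50 and 150 units (assumed).
demands = [random.uniform(50, 150) for _ in range(100_000)]
avg_demand = sum(demands) / len(demands)

# "Flaw of averages": profit computed at the average demand...
profit_at_average = profit(avg_demand)

# ...versus the average profit across all simulated demand scenarios.
average_profit = sum(profit(d) for d in demands) / len(demands)

print(f"profit at average demand: {profit_at_average:8.2f}")
print(f"average simulated profit: {average_profit:8.2f}")
```

Because profit is capped at capacity, downside demand scenarios hurt more than upside ones help, so the plan based on average demand (about 1,000 here) overstates the expected profit (about 875). Only the simulation over varying outcomes reveals the gap.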

When uncertainty matters the most

Our experience teaches us how to live with the uncertainties of frequently occurring events such as daily variations in the weather or the stock market. But we get anxious about uncertainties when the events are rare and the stakes are high: That’s why most of us panic in the face of a medical mystery, environmental disaster, financial crisis, or a presidential election. It’s also why we prefer leaders and authority figures who pretend to know exactly what to do all the time instead of acknowledging ambiguity.

Paradoxically, these are the very situations where we need to be skeptical about claims made with certainty and strive to have a clear and objective understanding of inherent uncertainties. Wisdom doesn’t emerge from knowing with certainty, but from being aware of your level of uncertainty.