In the 1970s, two psychologists proved, once and for all, that humans are not rational creatures. Daniel Kahneman and Amos Tversky discovered “cognitive biases,” showing that humans systematically make choices that defy clear logic.
But what Kahneman and Tversky acknowledged, and what is all too often overlooked, is that being irrational is a good thing. We humans don’t always make decisions by carefully weighing up the facts, but we often make better decisions as a result.
To fully explore this, it’s important to define “rational,” which is an unexpectedly slippery term. Hugo Mercier, a researcher at the Institut des Sciences Cognitives-Marc Jeannerod in France and the co-author of “The Enigma of Reason,” says that he’s never fully understood quite what “rational” means.
“Obviously rationality has to be defined according to how well you accomplish some goals. You can’t be rational in a vacuum, it doesn’t mean anything,” he says. “The problem is there’s so much flexibility in defining what you want.”
There is, for example, an ongoing philosophical debate about whether drug addicts are rational: by taking drugs they are, after all, maximizing their pleasure, even if they harm themselves in the process.
Colloquially, “rational” has several meanings. It can describe a thinking process based on an evaluation of objective facts (rather than superstition or powerful emotions); a decision that maximizes personal benefit; or simply a decision that’s sensible. In this article, the first definition applies: Rational decisions are those grounded on solid statistics and objective facts, resulting in the same choices as would be computed by a logical robot. But they’re not necessarily the most sensible.
Trust your instincts
Despite the growing reliance on “big data” to game out every decision, it’s clear to anyone with a glimmer of self-awareness that humans are incapable of constantly rational thought. We simply don’t have the time or capacity to calculate the statistical probabilities and potential risks that come with every choice.
But even if we were able to live life according to such detailed calculations, doing so would put us at a massive disadvantage. This is because we live in a world of deep uncertainty, in which neat logic simply isn’t a good guide. It’s well established that data-based decisions don’t inoculate against irrationality or prejudice, but even if it were possible to create a perfectly rational decision-making system based on all past experience, it wouldn’t be a foolproof guide to the future.
Unconvinced? There’s an excellent real-world example of this: the financial crisis. Experts created sophisticated models that deemed the events of the 2007 crisis statistically impossible. Gerd Gigerenzer, director of the Max Planck Institute for Human Development in Germany, who studies decision-making in real-world settings, says there is a major flaw in any system that attempts to be overly rational in our highly uncertain world.
“If you fine-tune on the past with an optimization model, and the future is not like the past, then that can be a big failure, as illustrated in the last financial crisis,” he explains. “In a world where you can calculate the risks, the rational way is to rely on statistics and probability theory. But in a world of uncertainty, not everything is known—the future may be different from the past—then statistics by itself cannot provide you with the best answer anymore.”
Henry Brighton, a cognitive science and artificial intelligence professor at Tilburg University in the Netherlands, who’s also a researcher at the Max Planck Institute, adds that, in a real-world setting, most truly important decisions rely at least in part on subjective preferences.
“The number of objective facts deserving of that term is extremely low and almost negligible in everyday life,” he says. “The whole idea of using logic to make decisions in the world is to me a fairly peculiar one, given that we live in a world of high uncertainty which is precisely the conditions in which logic is not the appropriate framework for thinking about decision-making.”
Let your gut guide you
Instead of relying on complex statistics to make choices, humans tend to make decisions according to instinct. Often, these instincts rely on “heuristics,” or mental shortcuts, where we focus on one key factor to make a decision, rather than taking into account every tiny detail.
However, these heuristics aren’t simply time-savers. They can also be remarkably accurate at selecting the best option. Heuristics tune out the noise that can mislead an overly complicated analysis. This explains why simply dividing your money equally among assets can outperform even the most sophisticated portfolios.
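The equal-split idea can be sketched in a few lines of Python. This is a toy simulation on synthetic data, not a claim about any real market: the assets, return numbers, and the deliberately naive “optimizer” are all assumptions made for illustration. The point is structural: when past differences between assets are pure noise, weights fitted to that past chase the noise, while a 1/N split ignores it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly returns for 5 assets with identical true means,
# so any past difference between assets is pure noise (an assumption).
n_assets, n_past, n_future = 5, 24, 120
past = rng.normal(0.01, 0.05, (n_past, n_assets))
future = rng.normal(0.01, 0.05, (n_future, n_assets))

# "Optimized" weights: lean toward whichever assets happened to do
# best in the past sample -- a deliberately overfit allocation.
past_means = past.mean(axis=0)
opt = np.exp(50 * past_means)
opt /= opt.sum()

# 1/N heuristic: split money equally, ignoring the noisy history.
equal = np.full(n_assets, 1 / n_assets)

print("overfit weights:", np.round(opt, 2))
print("out-of-sample mean return, overfit:", (future @ opt).mean())
print("out-of-sample mean return, 1/N:   ", (future @ equal).mean())
```

Because the past differences carry no real information here, the fitted weights buy no out-of-sample advantage over the equal split, which is the intuition behind the heuristic’s robustness.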
“In a world where all options and probabilities are known, a heuristic can only be faster but never more accurate,” says Gigerenzer. “In a world of uncertainty, which is typically the situation we face, where one cannot optimize by definition, heuristics tend to be more robust.”
For example, the recognition heuristic explains why we’re more likely to buy a product we know, or look for familiar faces in a crowd. And though this can be taken advantage of by advertisers, Gigerenzer’s work has shown that name recognition can predict the winners of Wimbledon tournaments better than the complex ATP rankings or other criteria.
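The rule itself is strikingly simple. Here is a toy sketch of how such a recognition heuristic might be modeled; the function, the fallback behavior, and the names are illustrative assumptions, not Gigerenzer’s actual code or data.

```python
def recognition_heuristic(option_a, option_b, recognized):
    """Pick whichever option you recognize; if you recognize both or
    neither, the heuristic doesn't apply and other cues must decide.

    `recognized` is the set of names the decision-maker has heard of.
    """
    a_known = option_a in recognized
    b_known = option_b in recognized
    if a_known and not b_known:
        return option_a
    if b_known and not a_known:
        return option_b
    return None  # heuristic silent: fall back to other information

# Hypothetical example: predicting a match winner from name
# recognition alone, with made-up player names.
known_names = {"Federer"}
print(recognition_heuristic("Federer", "Qualifier X", known_names))  # → Federer
```

The interesting property is what the rule throws away: rankings, form, statistics. In Gigerenzer’s studies, that single recognition cue was enough to beat far more elaborate predictors.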
Though they’re not perfect in all circumstances (our instincts can lead us to biased or racist assumptions, for example), heuristics are a highly useful tool for making decisions in our unstable world. “These are evolved capacities that have probably evolved for a reason,” says Brighton. “You could argue it’s irrational to try and weigh up all these unknown factors and it’s more rational to try and rely on their gut—which, for all we know, may be taking into account cues that aren’t obvious.”
Kahneman and Tversky recognized that heuristics and cognitive biases can be highly effective mechanisms, but all too often these biases are portrayed as flaws in our thought process. However, Gigerenzer insists that such biases are only weaknesses in very narrow settings. Cognitive biases tend to be highlighted in lab experiments, where human decisions are measured against probability theory. This is often “the wrong yardstick,” says Brighton.
For example, hyperbolic discounting is a well-known cognitive bias, whereby people instinctively prefer $50 now over $100 in a year’s time, even though waiting leads to the greater reward. That may seem silly in a perfect economic model, but imagine the scenario in the real world: if a friend offered you a sum of money now or double in twelve months’ time, you might well take the money immediately on offer. After all, he could forget, or break his promise, or your friendship could sour. The many variables of the real world mean it makes sense to hold on to whatever rewards we can quickly get our hands on.
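The standard hyperbolic-discounting formula, V = A / (1 + kt), makes that preference concrete. The sketch below uses a purely illustrative discount rate k, not an empirical estimate; in real life, k effectively bundles in the risk that a delayed promise never pays out.

```python
def hyperbolic_value(amount, delay_years, k=1.5):
    """Perceived present value under hyperbolic discounting:
    V = A / (1 + k * t), where k is an illustrative discount rate."""
    return amount / (1 + k * delay_years)

now = hyperbolic_value(50, 0)     # $50 today keeps its full value
later = hyperbolic_value(100, 1)  # $100 in a year feels like less
print(now, later)  # → 50.0 40.0
```

With this (assumed) value of k, the delayed $100 is felt as only $40, so grabbing the $50 now is exactly what the model predicts.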
We need hot-headed, emotional decisions
Though calling someone hot-headed or overly emotional is generally a critique of their thinking process, emotions are in fact essential to decision-making. There’s even research showing that people who suffer damage to the brain regions governing emotion often struggle to make decisions. They can weigh up the pros and cons, but can’t commit to one option.
This makes sense, given that positive emotions are often the ultimate ends of our decisions—we can only choose what course to take if we know what will make us happy. “You can very well know that the world is going to end tomorrow but if you have no desire to live or do anything then you shouldn’t give a damn about it. Facts on their own don’t tell you anything,” says Mercier. “It’s only paired with preferences, desires, with whatever gives you pleasure or pain, that can guide your behavior. Even if you knew the facts perfectly, that still doesn’t tell you anything about what you should do.”
Though emotions can derail rational thought, there are occasions where coldly rational thinking would be entirely inappropriate. Take finding a partner, for example. If you had the choice between a good-looking high-earner who your mother approves of, versus someone you love who makes you happy every time you speak to them—well, you’d be a fool not to follow your heart.
And even when feelings defy reason, it can be a good idea to go along with the emotional rollercoaster. After all, the world can be an entirely terrible place and, from a strictly logical perspective, optimism is somewhat irrational. But it’s still useful. “It can be beneficial not to run around in the world and be depressed all the time,” says Gigerenzer.
The same goes for courage. Courageous acts and leaps of faith are often attempts to overcome great and seemingly insurmountable challenges. (It wouldn’t take much courage if it were easy to do.) But while courage may be irrational or hubristic, we wouldn’t have many great entrepreneurs or works of art without those with a somewhat illogical faith in their own abilities.
We don’t make decisions in isolation
There are, of course, occasions where we’d benefit from humans being more rational. Like politics, for example. The fallibility of human reasoning has been much discussed recently following unexpected and controversial populist uprisings (such as Britain’s “Brexit” referendum and the election of US president Trump). There’s understandable consternation about why people would vote against their own interests.
But, as a recent New Yorker piece explains, our attitude to facts makes evolutionary sense given that humans developed to be social creatures, not logicians analyzing GDP trends. Dan Sperber, a cognitive scientist at Central European University and Mercier’s co-author, says that the social implications of any decision are far from irrelevant. “Even if a decision seems to bring a benefit, if it is ill-judged by others, then there’s a cost,” he says. “The main role of reasoning in decision-making is not to arrive at the decision but to be able to present the decision as something that’s rational.”
He believes we only use reason to retrospectively justify the decision, and largely rely on unquestioned instincts to make choices. It makes good sense that, on occasion, instincts would encourage us to arrive at the same conclusion as those around us. After all, endless arguments about who’s right can easily lead to social ostracization.
Similarly, we’re happy to unthinkingly agree with others’ seeming expertise because this trait is key to our capacity to collaborate. It can be problematic when we unquestioningly go along with pundits on TV, but it does have its uses.
“Relying on our community of knowledge is absolutely critical to functioning. We could not do anything alone,” says Philip Fernbach, cognitive scientist at the University of Colorado. “This is increasingly true. As technology gets more complex it is increasingly the case that no one individual is a master of all elements of it.”
Even the cognitive biases that can lead to irrational political decisions do have some advantages. After all, refusing to rely on others’ reasoning and failing to consider how our responses would be socially received would likely leave us isolated and unable to get much done.
Of course, no human is perfect, and there are downsides to our instincts. But, overall, we’re still far better suited to the real world than the most perfectly logical thinking machine. We’re inescapably irrational, and far better thinkers as a result.