Our traditional economic models assume that we act as rational agents. The general belief is that we all behave in ways that maximize our utility, and thus thrive in the world.
On the surface, this may not appear to be anything worth disputing, but it is. As decades of work in behavioral economics have shown, this idealistic image of how we think just isn't true.
I am not a rational person, and neither are you. We may strive to be, and we may aim to maximize our utility, but there is a natural gap between the ends and the means. Our brains have known limitations that hold us back. Over the past few decades, researchers have documented hundreds of subtle mental biases and errors that cause us to act irrationally. Most of them are baked into how we intuitively operate, and many of them get in the way of optimizing our lives for the best possible outcome.
A full list could probably fill a book, but some of them are more prominent than others, and they tend to affect us more negatively, too. Here are five that I see come up again and again.
The first might be called information bias: when it comes to information, we have a natural tendency to presume that more is better. We plan for longer, we collect more notes, and we delay action. The problem is that most of this information doesn't affect whatever it is we're trying to accomplish. It's easy to get caught up collecting everything that appears connected to our objective but really isn't.
In fact, when it comes to predicting future trends, some studies have shown that people actually make more accurate predictions when they have less information to go on. There is less room for irrelevant ideas to get in the way.
The same is true of taking productive action. Less is generally more. Rather than collecting all the available information, it's far better to honestly evaluate which pieces are high-impact and which aren't. Once the high-impact ideas have been covered, gathering more has diminishing and even negative returns.
It’s often more costly to delay action than it is to miss out on more details.
A number of studies have shown that we often base our decisions on visible outcomes while completely ignoring the process that produced them.
An extreme case of this is apparent in the mind of a gambler. When a compulsive gambler wins, to him, it's a sign that he should play again. He ignores the process behind that win (in many cases, pure luck) and presumes that the outcome is the evidence he needs to continue. Given that luck won't necessarily be on his side the next time he plays, it's very unlikely that he will get the result he wants. Even if luck happens to strike twice, over the long term, this is a losing proposition.
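The "losing proposition" here is just arithmetic, and a quick simulation makes it concrete. The specific game below, with its 48% win probability and even-money payout, is my own illustrative assumption (most casino games carry a similar small house edge), not something from the article:

```python
import random

# Hypothetical game (illustrative assumption): bet 1 unit on a wager that
# pays +1 unit 48% of the time and loses the stake 52% of the time.
def play_once(rng):
    return 1 if rng.random() < 0.48 else -1

def simulate(n_plays, seed=42):
    """Total profit after n_plays repeated bets."""
    rng = random.Random(seed)
    return sum(play_once(rng) for _ in range(n_plays))

# A lucky early win says nothing about the long run. The expected value per
# play is 0.48*(+1) + 0.52*(-1) = -0.04 units, so over many plays the
# bankroll drifts steadily downward.
print(simulate(10))       # short run: noise; could easily be positive
print(simulate(100_000))  # long run: reliably negative, around -4000
```

A winning streak in the short run is entirely compatible with a process that loses money in expectation, which is exactly the distinction outcome bias blinds the gambler to.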
A past outcome is not indicative of the future result unless the process behind that outcome is likely to be replicated in the forthcoming effort. Outcome bias leads to aimless and misdirected action, and it’s often harmful.
The halo effect explains why it’s so easy for us to idolize certain objects.
The human brain has a habit of taking one positive attribute of a person or thing and extending that impression into an assumption of general competence across the other qualities that person or thing possesses. For example, it's no coincidence that we often assume attractive people are also nicer and kinder, more capable at their jobs, and more deserving of higher pay.
Similarly, it explains why we are quick to overlook the faults and deficits of people and things we have developed an initial attachment to. The different qualities someone or something possesses are far less related than the halo effect makes them seem.
Looks, talent, and kindness, for example, are all different traits, and they should be evaluated separately. Additionally, if someone is wrong about something, your first impression of them isn’t a good reason to ignore that.
Everybody loves a good success story. When we hear one, most of us immediately gravitate towards it. We crave the details in hopes of a lesson. The problem with success stories, however, is that they skew our perception of reality. For every person who succeeded by employing that lesson, there are maybe ten others who got nowhere doing the same thing.
Just because Mark Zuckerberg and Bill Gates dropped out of college to become entrepreneurs and succeeded, it doesn't mean that you will have similar luck. There is a whole world of college dropouts who went nowhere and whom nobody ever talks about.
Survivorship bias is our tendency to assume that what worked for someone else will work for us, without looking at every side of the story. History remembers winners but rarely losers, and as a result, we overvalue the lessons of the people who made it.
Most learning occurs when we are wrong, when something forces us to question our existing beliefs about how the world works. Unfortunately, doing so is hard, because confirmation bias is one of the most persistent errors our brain makes. It isn't concerned with the truth of a situation, but with finding evidence that supports our existing way of thinking. It tells us what we want to hear.
This explains why it’s so difficult for people to change their minds about something once they have decided on a particular position. If you identify with a certain political party, chances are that you are going to cherry-pick the facts that reinforce your distaste for the opposing party and find evidence that’s going to convince you of how great your candidate is.
The longer you believe something, and the stronger your position, the more likely you are to torture reality to fit a false narrative that supports you.
Daniel Kahneman, the psychologist whose work earned the Nobel Prize in economics, has influenced much of the research being done in this domain today. He tends to be fairly pessimistic about our ability to overcome these mental hindrances.
Other researchers disagree. While some of these biases are deeply ingrained in our minds, they believe that with self-awareness we can at least be more cautious, and thus less likely to go down those roads.
In my own experience, I find the latter to be true. I definitely still make these errors, but since becoming aware of them and their negative consequences, I make them less frequently. Even when my intuitive compass nudges me towards a bias, my habit of pausing before going further has helped me catch many of my own errors.
Seek out your biases, internalize their existence, and think before you act. It’s simple, and maybe not completely foolproof, but it helps. You’ll be much better for it.
Want to think and live smarter? Zat Rana publishes a free weekly newsletter for 10,000+ readers at Design Luck.