You make thousands of decisions every day — from what to eat for breakfast to which job offer to take.
And you might think you approach all those decisions rationally.
Yet research suggests there are a huge number of cognitive stumbling blocks that can affect our behavior, preventing us from acting in our own best interests.
Here, we’ve rounded up some of the most commonly cited biases that screw up our decision-making.
People are overreliant on the first piece of information they hear, a tendency known as anchoring bias.
In a salary negotiation, for instance, whoever makes the first offer establishes a range of reasonable possibilities in each person’s mind.
Any counteroffer will naturally be anchored by that opening offer.
The availability heuristic leads people to overestimate the importance of information that is readily available to them.
For instance, a person might argue that smoking is not unhealthy on the basis that his grandfather lived to 100 and smoked three packs a day.
The bandwagon effect: the probability that one person adopts a belief increases with the number of people who already hold it. This is a powerful form of groupthink, and it's one reason meetings are often unproductive.
Failing to recognize your own cognitive biases is a bias in itself, known as the blind-spot bias.
Notably, Princeton psychologist Emily Pronin has found that “individuals see the existence and operation of cognitive and motivational biases much more in others than in themselves.”
Choice-supportive bias: when you choose something, you tend to feel positive about it, even if the choice has flaws. You think your dog is awesome, even if it bites people every once in a while, and that other dogs are stupid, since they're not yours.
The clustering illusion is the tendency to see patterns in random events. It is central to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette wheel after a string of reds.
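The roulette claim can be checked with a quick simulation. This is a hypothetical sketch that simplifies the game by ignoring the green zero pocket and treating red and black as equally likely; it estimates how often red comes up on the spin immediately after a streak of reds.

```python
import random

def spin(rng):
    """One simplified roulette spin: red or black with equal probability.
    (Simplifying assumption: the green zero pocket is ignored.)"""
    return rng.choice(["red", "black"])

def red_rate_after_streak(trials=200_000, streak=5, seed=42):
    """Estimate P(red) on the spin immediately following `streak` reds in a row."""
    rng = random.Random(seed)
    run = 0                      # length of the current red streak
    after_streak = hits = 0
    for _ in range(trials):
        result = spin(rng)
        if run >= streak:        # the previous `streak` spins were all red
            after_streak += 1
            hits += result == "red"
        run = run + 1 if result == "red" else 0
    return hits / after_streak

# The rate after a streak stays close to 0.5: past spins don't change the odds.
print(round(red_rate_after_streak(), 3))
```

Because each spin is independent, conditioning on a streak of reds leaves the probability of the next red unchanged; the simulation confirms the estimate hovers around 0.5 rather than tilting toward black.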
Confirmation bias is the tendency to listen only to information that confirms our preconceptions, one of the many reasons it's so hard to have an intelligent conversation about climate change.
Conservatism bias leads people to favor prior evidence over new evidence or information that has emerged. People were slow to accept that the Earth was round because they clung to their earlier understanding that the planet was flat.
Information bias is the tendency to seek information even when it cannot affect action. More information is not always better; indeed, with less information, people can often make more accurate predictions.
The ostrich effect is the decision to ignore dangerous or negative information by "burying" one's head in the sand, like an ostrich. Research suggests that investors check the value of their holdings significantly less often during bad markets.
But there’s an upside to acting like a big bird, at least for investors. When you have limited knowledge about your holdings, you’re less likely to trade, which generally translates to higher returns in the long run.