51 cognitive biases that screw up everything we do

We like to think we’re rational human beings.

In fact, we are prone to hundreds of proven biases that cause us to think and act irrationally. Even believing we're rational despite evidence of irrationality in others is itself a bias, known as the blind-spot bias.

The study of how often human beings do irrational things was enough for psychologist Daniel Kahneman to win the Nobel Prize in Economics, and it opened the rapidly expanding field of behavioural economics. Similar insights are also reshaping everything from marketing to criminology.

Hoping to clue you — and ourselves — into the biases that frame our decisions, we’ve collected a long list of the most notable ones.

This is an update of an article originally written by Gus Lubin, with additional contributions by Drake Baer.

Anchoring bias

People are overreliant on the first piece of information they hear.

In a salary negotiation, for instance, whoever makes the first offer establishes a range of reasonable possibilities in each person's mind. Any counteroffer will naturally react to or be anchored by that opening offer.

'Most people come with the very strong belief they should never make an opening offer,' said Leigh Thompson, a professor at Northwestern University's Kellogg School of Management. 'Our research and lots of corroborating research shows that's completely backwards. The guy or gal who makes a first offer is better off.'

Observer-expectancy effect

A cousin of confirmation bias, here our expectations unconsciously influence how we perceive an outcome. Researchers looking for a certain result in an experiment, for example, may inadvertently manipulate or interpret the results to reveal their expectations.

That's why the 'double-blind' experimental design was created for the field of scientific research.

Clustering illusion


This is the tendency to see patterns in random events. It is central to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette table after a string of reds.
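A quick simulation makes the point concrete. This is an illustrative sketch, not a claim about any real wheel; the weights assume a single-zero European roulette layout (18 red, 18 black, 1 green):

```python
import random

random.seed(42)

# Simulate a European roulette wheel: 18 red, 18 black, 1 green (zero).
def spin():
    return random.choices(["red", "black", "green"], weights=[18, 18, 1])[0]

spins = [spin() for _ in range(1_000_000)]

# Overall frequency of red across all spins
baseline_red = spins.count("red") / len(spins)

# Frequency of red immediately after three reds in a row
after_streak = [
    spins[i]
    for i in range(3, len(spins))
    if spins[i - 3] == spins[i - 2] == spins[i - 1] == "red"
]
streak_red = after_streak.count("red") / len(after_streak)

print(f"P(red) overall:           {baseline_red:.3f}")
print(f"P(red | 3 reds in a row): {streak_red:.3f}")
# Both hover around 18/37, roughly 0.486 -- past spins don't shift the odds.
```

The streak-conditioned frequency matches the baseline, which is exactly what the gambler's intuition denies.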

Conservatism bias


Where people believe prior evidence more than new evidence or information that has emerged. People were slow to accept the fact that the Earth was round because they maintained their earlier understanding that the planet was flat.

Conformity


This is the tendency of people to conform with other people. It is so powerful that it may lead people to do ridiculous things, as shown by the following experiment by Solomon Asch.

Ask one real subject and several fake subjects (who are really working with the experimenter) which of lines B, C, D, and E is the same length as line A. If all of the fake subjects say that D is the same length as A, the real subject will go along with this objectively false answer a shocking three-quarters of the time.

'That we have found the tendency to conformity in our society so strong that reasonably intelligent and well-meaning young people are willing to call white black is a matter of concern,' Asch wrote. 'It raises questions about our ways of education and about the values that guide our conduct.'

Curse of knowledge


When people who are more well-informed cannot understand the common man. For instance, in the TV show 'The Big Bang Theory,' it's difficult for scientist Sheldon Cooper to understand his waitress neighbour Penny.

Duration neglect


When the duration of an event doesn't factor enough into the way we consider it. For instance, we remember momentary pain just as strongly as long-term pain.

Kahneman and colleagues tracked patients' pain during colonoscopies (they used to be more uncomfortable) and found that the end of the procedure pretty much determined patients' evaluations of the entire experience. One set of patients underwent a shorter procedure in which the end was relatively painful. The other set of patients underwent a longer procedure in which the end was less painful.

Results showed that the second set of patients (the longer colonoscopy) rated the procedure as less painful overall.

Galatea effect


Where people succeed -- or underperform -- because they think they should.

Call it a self-fulfilling prophecy. For example, in schools it describes how students who are expected to succeed tend to excel and students who are expected to fail tend to do poorly.

Halo effect

Where we take one positive attribute of someone and associate it with everything else about that person or thing.

It helps explain why we often assume highly attractive individuals are also good people, why they tend to get hired more easily, and why they earn more money.

Hard-easy bias

Where people are overconfident on hard problems and not confident enough on easy ones.

Herding


People tend to flock together, especially in difficult or uncertain times.

Hyperbolic discounting


The tendency for people to want an immediate payoff rather than a larger gain later on.
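Behavioural economists often model this with a hyperbolic discount curve, V = A / (1 + kD), where A is the payoff, D the delay, and k an impatience parameter. The sketch below is illustrative only; the value of k is an arbitrary assumption, not an empirical estimate:

```python
def hyperbolic_value(amount, delay_days, k=0.05):
    """Perceived present value of a delayed payoff under hyperbolic
    discounting. k is an illustrative impatience parameter."""
    return amount / (1 + k * delay_days)

# $100 now vs $120 in a month: up close, the smaller-sooner payoff wins.
now = hyperbolic_value(100, 0)     # 100.0
later = hyperbolic_value(120, 30)  # 48.0

# Viewed from a year away, the same one-month gap barely matters,
# so the larger-later payoff wins -- preferences reverse as payoffs near.
far = hyperbolic_value(100, 365)
far_plus = hyperbolic_value(120, 395)

print(now, later)
print(far, far_plus)
```

The reversal is the signature of hyperbolic (as opposed to exponential) discounting: the same two options swap rank depending on how far away they are.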

Irrational escalation


When people make irrational decisions based on past rational decisions. It may happen in an auction, when a bidding war spurs two bidders to offer more than they would otherwise be willing to pay.

Ostrich effect


The decision to ignore dangerous or negative information by 'burying' one's head in the sand, like an ostrich. Research suggests that investors check the value of their holdings significantly less often during bad markets.

But there's an upside to acting like a big bird, at least for investors. When you have limited knowledge about your holdings, you're less likely to trade, which generally translates to higher returns in the long run.

Overconfidence


Some of us are too confident about our abilities, and this causes us to take greater risks in our daily lives.

Perhaps surprisingly, experts are more prone to this bias than laypeople. An expert might make the same inaccurate prediction as someone unfamiliar with the topic -- but the expert will probably be convinced that he's right.

Pessimism bias

This is the opposite of the overoptimism bias. Pessimists overweight the negative consequences of their own and others' actions.

Those who are depressed are more likely to exhibit the pessimism bias.

Placebo effect

When simply believing that something will have a certain impact on you causes it to have that effect.

This is a basic principle of stock market cycles, as well as a supporting feature of medical treatment in general. People given 'fake' pills often experience the same physiological effects as people given the real thing.

Post-purchase rationalization


Making ourselves believe that a purchase was worth the value after the fact.

Priming

Priming is where exposure to one idea makes you more readily identify related ideas.

Let's take an experiment as an example, from Less Wrong:

Suppose you ask subjects to press one button if a string of letters forms a word, and another button if the string does not form a word. (E.g., 'banack' vs. 'banner'.) Then you show them the string 'water'. Later, they will more quickly identify the string 'drink' as a word. This is known as 'cognitive priming' ...

Priming also reveals the massive parallelism of spreading activation: if seeing 'water' activates the word 'drink', it probably also activates 'river', or 'cup', or 'splash.'

Pro-innovation bias


When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations. Sound familiar, Silicon Valley?

Procrastination

Deciding to act in favour of the present moment over investing in the future.

For example, even if your goal is to lose weight, you might still go for a thick slice of cake today and say you'll start your diet tomorrow.

That happens largely because, when you set the weight-loss goal, you don't take into account that there will be many instances when you're confronted with cake and you don't have a plan for managing your future impulses.

Recency

The tendency to weigh the latest information more heavily than older data.

As financial planner Carl Richards writes in The New York Times, investors often think the market will always look the way it looks today and therefore make unwise decisions: 'When the market is down we become convinced that it will never climb out, so we cash out our portfolios and stick the money in a mattress.'

Salience

Our tendency to focus on the most easily recognisable features of a person or concept.

For example, research suggests that when there's only one member of a racial minority on a business team, other members use that individual's performance to predict how any member of that racial group would perform.

Scope insensitivity


This is where your willingness to pay for something doesn't correlate with the scale of the outcome.

From Less Wrong:

Once upon a time, three groups of subjects were asked how much they would pay to save 2,000 / 20,000 / 200,000 migrating birds from drowning in uncovered oil ponds. The groups respectively answered $US80, $US78, and $US88. This is scope insensitivity or scope neglect: the number of birds saved -- the scope of the altruistic action -- had little effect on willingness to pay.

Self-enhancing transmission bias


Everyone shares their successes more than their failures. This leads to a false perception of reality and an inability to accurately assess situations.

Stereotyping

Expecting a group or person to have certain qualities without having real information about the individual.

There may be some value to stereotyping because it allows us to quickly identify strangers as friends or enemies. But people tend to overuse it.

For example, one study found that people were more likely to hire a hypothetical male candidate over a female candidate to perform a mathematical task, even when they learned that the candidates would perform equally well.

Survivorship bias


An error that comes from focusing only on surviving examples, causing us to misjudge a situation. For instance, we might think that being an entrepreneur is easy because we haven't heard of all of the entrepreneurs who have failed.

It can also cause us to assume that survivors are inordinately better than failures, without regard for the importance of luck or other factors.

Zero-risk bias


Sociologists have found that we love certainty -- even if it's counterproductive.

Thus the zero-risk bias.

In general, people tend to prefer approaches that eliminate some risks completely, as opposed to approaches that reduce all risks -- even though the second option would produce a greater overall decrease in risk.
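The arithmetic behind that preference is easy to sketch. The hazard probabilities below are hypothetical, chosen only to show how a 'feels safer' option can remove less total risk:

```python
# Two hypothetical hazards in a decision-maker's portfolio of risks
risks = {"rare_hazard": 0.01, "common_hazard": 0.10}

# Option A: eliminate the rare hazard entirely -- the "zero risk" choice.
reduction_a = risks["rare_hazard"] - 0.0   # removes 1 percentage point

# Option B: cut the common hazard in half.
reduction_b = risks["common_hazard"] / 2   # removes 5 percentage points

print(f"Option A removes {reduction_a:.0%} of total risk")
print(f"Option B removes {reduction_b:.0%} of total risk")
# Option B removes five times as much risk, yet zero-risk bias pulls
# many people toward Option A's complete elimination of one hazard.
```

The certainty of 'zero' on one line item crowds out the larger expected gain on the other.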
