We like to think that we’re totally logical when we make decisions.
But over the past few decades, social science has uncovered a staggering number of cognitive biases that shape our behaviour — whether we know it or not.
We’ve collected a list of the most common ones.
People are over-reliant on the first piece of information they hear, a tendency known as the anchoring bias.
In a salary negotiation, for instance, whoever makes the first offer establishes a range of reasonable possibilities in each person's mind.
Any counteroffer will naturally be anchored by that opening offer.
The bandwagon effect: the probability of one person adopting a belief increases based on the number of people who already hold that belief. This is a powerful form of groupthink -- and it's a reason meetings are often unproductive.
The clustering illusion is the tendency to see patterns in random events. It is central to various gambling fallacies, like the idea that red is more or less likely to come up on a roulette wheel after a string of reds.
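Because each spin of a fair wheel is independent, a streak of reds says nothing about the next spin. A minimal simulation sketches this, assuming (hypothetically) a European-style wheel with 18 red pockets out of 37:

```python
import random

random.seed(0)

# Hypothetical fair European wheel: 18 of 37 pockets are red.
P_RED = 18 / 37
spins = [random.random() < P_RED for _ in range(1_000_000)]

# Overall frequency of red across all spins.
overall = sum(spins) / len(spins)

# Frequency of red immediately after three reds in a row.
after_streak = [spins[i] for i in range(3, len(spins))
                if spins[i - 3] and spins[i - 2] and spins[i - 1]]
conditional = sum(after_streak) / len(after_streak)

print(f"P(red)                 ~ {overall:.3f}")
print(f"P(red | 3 reds before) ~ {conditional:.3f}")
```

The two estimates come out essentially identical: a streak of reds does not make the wheel "due" for anything.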
Conformity is the tendency of people to align their behaviour with that of others. It is so powerful that it may lead people to do ridiculous things, as shown by a famous experiment by legendary psychologist Solomon Asch.
In those experiments, he found that about three-quarters of participants would say, at least once, that two lines of clearly different lengths were the same length -- so long as everybody else in the room was saying so.
Conservatism bias is where people favour prior evidence over new evidence or information that has emerged. People were slow to accept that the Earth was round because they maintained their earlier understanding that the planet was flat.
The curse of knowledge is when well-informed people find it hard to see things from the perspective of the common man. For instance, in the TV show 'The Big Bang Theory,' it's difficult for scientist Sheldon Cooper to understand his waitress neighbour Penny.
The decoy effect is a phenomenon in marketing where consumers show a specific change in preference between two choices after being presented with a third.
Offer two sizes of soda and people may choose the smaller one; but offer a third, even larger size, and people may choose what is now the medium option.
The Pygmalion effect: where people succeed -- or underperform -- because they think they should.
Call it a self-fulfilling prophecy. For example, in schools it describes how students who are expected to succeed tend to excel and students who are expected to fail tend to do poorly.
Hindsight bias is the tendency to see past events as having been predictable all along. Of course Apple and Google would become the two most important companies in phones. Tell that to Nokia, circa 2003.
The observer-expectancy effect, a cousin of confirmation bias, is where our expectations unconsciously influence how we perceive an outcome.
Researchers looking for a certain result in an experiment, for example, may inadvertently manipulate or interpret the results to reveal their expectations.
The omission bias is the tendency to prefer inaction to action, in ourselves and even in politics.
Psychologist Art Markman gave a great example in 2010:
In March, President Obama pushed Congress to enact sweeping health care reforms. Republicans hope that voters will blame Democrats for any problems that arise after the law is enacted.
But since there were problems with health care already, can they really expect that future outcomes will be blamed on Democrats, who passed new laws, rather than Republicans, who opposed them? Yes, they can -- the omission bias is on their side.
Priming is where exposure to one idea makes you more likely to identify related ideas shortly afterward.
For instance, if you show somebody the word water, they will be more likely to identify the words river, cup, or splash afterward.
Self-enhancing transmission bias: everyone shares their successes more than their failures, which leads to a false perception of reality and an inability to accurately assess situations.
It's also why people seem way happier on Instagram than anyone could be in real life.
Survivorship bias is an error that comes from focusing only on surviving examples, causing us to misjudge a situation.
For instance, we might think that being an entrepreneur is easy because we haven't heard of all of the entrepreneurs who have failed.
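A toy simulation can sketch how large the distortion gets. The numbers here are purely hypothetical: ventures draw a return from a normal distribution with a slightly negative mean, and only the money-making ones ever get talked about.

```python
import random

random.seed(42)

# Hypothetical model: most ventures lose money on average
# (mean annual return -5%, standard deviation 30%).
N = 100_000
returns = [random.gauss(-0.05, 0.30) for _ in range(N)]

# The true picture, averaged over every venture that was started.
mean_all = sum(returns) / N

# Survivorship: only ventures that made money are visible to observers.
survivors = [r for r in returns if r > 0]
mean_survivors = sum(survivors) / len(survivors)

print(f"Mean return, all ventures:   {mean_all:+.1%}")
print(f"Mean return, survivors only: {mean_survivors:+.1%}")
```

Under these made-up numbers, the average venture loses money, yet the visible survivors show a strongly positive average -- exactly the gap between how easy entrepreneurship looks and how it actually goes.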
Sociologists have found that we love certainty -- even if it's counterproductive.
Thus the zero-risk bias.
'Zero-risk bias occurs because individuals worry about risk, and eliminating it entirely means that there is no chance of harm being caused,' says decision science blogger Steve Spaulding. 'What is economically efficient and possibly more relevant, however, is not bringing risk from 1% to 0%, but from 50% to 5%.'
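Spaulding's point can be made concrete with a bit of expected-value arithmetic. Assuming, purely for illustration, that the risk event causes 1,000 'units' of harm whenever it occurs:

```python
# Hypothetical harm caused each time the risk event occurs.
HARM = 1_000

def expected_harm_reduction(p_before: float, p_after: float) -> float:
    """Drop in expected harm when the event's probability falls."""
    return (p_before - p_after) * HARM

zero_risk_option = expected_harm_reduction(0.01, 0.00)  # 1% -> 0%
big_cut_option   = expected_harm_reduction(0.50, 0.05)  # 50% -> 5%

print(f"Eliminating a 1% risk saves  {zero_risk_option:.0f} units of expected harm")
print(f"Cutting 50% down to 5% saves {big_cut_option:.0f} units of expected harm")
```

The zero-risk option feels satisfying, but under these illustrative numbers the 50%-to-5% cut prevents 45 times as much expected harm.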