We all view the world through different lenses. This has to do with our upbringing, our collective experiences, and our subconscious behavioural biases.
Until we take a deep look into why we see the world the way we do, we’ll never have the capacity for real change.
Inherent biases cause us to make snap judgments based on bad information, to be unfair and to waste time. This is clearly problematic for investors, managers and people in general.
We’ve collected a long list of cognitive biases from the Singularity Institute, Tim Richards’ Psy-Fi Blog and more, to bring these biases to light so we can disrupt our thinking and come to terms with reality.
Where people overestimate the importance of information that is readily available to them.
One example would be a person who argues that smoking is not unhealthy on the basis that his grandfather lived to 100 and smoked three packs a day, an argument that ignores the possibility that his grandfather was an outlier.
Read more about the availability heuristic.
The tendency to judge the strength of an argument by the believability of its conclusion rather than by whether it logically follows. For instance, the argument 'all tiger sharks are sharks, and all sharks are animals, therefore all animals are tiger sharks' is logically invalid, yet people tend to accept arguments of this form whenever the conclusion happens to sound plausible.
Read more about belief bias.
If you fail to recognise your own cognitive biases, you have a bias blind spot. Nearly everyone believes they are less biased than the average person, which is itself a cognitive bias.
Read more about bias blind spot.
A bias in which you think positive things about a choice once you've made it, even if that choice has flaws. You might praise the dog you just bought while ignoring the fact that it bites people.
Read more about choice-supportive bias.
The tendency to see a 'streak' or 'cluster' in data that is actually random.
In basketball, for instance, the 'hot hand' is the belief that a player who has hit several shots in a row is more likely to hit the next one.
A related error is the gambler's fallacy, in which one believes a number that hasn't come up for a while is 'due' -- even though each independent draw is equally likely.
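The independence at the heart of both errors is easy to check with a quick simulation (a hypothetical sketch with simulated flips, not real game data): after any streak, a fair coin is still 50/50.

```python
import random

random.seed(42)

# One million simulated flips of a fair coin (True = heads).
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect every flip that immediately follows three heads in a row.
after_streak = [flips[i] for i in range(3, len(flips))
                if flips[i - 3] and flips[i - 2] and flips[i - 1]]

p = sum(after_streak) / len(after_streak)
print(f"P(heads | three heads in a row) = {p:.3f}")  # close to 0.500
```

The streaks are real, but they predict nothing: the coin has no memory.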
Read more about the clustering illusion.
Where people favour prior evidence over new evidence or information that has emerged. People were slow to accept the fact that the earth was round because they tended to believe the earlier information that it was flat.
Read more about conservatism.
When people who are better informed find it difficult to think about a problem from the perspective of those who know less. For instance, in the TV show 'The Big Bang Theory,' the scientist Sheldon Cooper struggles to understand his waitress neighbour Penny.
Read more about the curse of knowledge.
Where people in one emotional or physical state fail to understand people in a different state. If you are happy, you can't imagine why anyone would be unhappy. When you are not sexually aroused, you can't understand how you behave when you are.
Read more about the empathy gap.
Where an idea causes an unconscious physical reaction, like a sad thought that makes your eyes tear up. This is also why Ouija boards seem to have minds of their own.
Read more about the ideomotor effect.
When weak but consistent data leads to confident predictions. As one commenter noted on the MIT admissions blog:
Why is MIT's admissions process better than random? Say you weeded out the un-qualified (the fewer-than-half of applicants insufficiently prepared to do the work at MIT) and then threw dice to stochastically select among the remaining candidates. Would this produce a lesser class?
Read more about illusion of validity.
Investing more money or resources into something because of a prior investment, even when you know it's a bad investment. 'I already own 500 shares of Lehman Brothers; let's buy more even though the stock is tanking.'
Read more about irrational escalation.
The tendency to give more weight to negative experiences than to positive ones. People with this bias feel that 'bad is stronger than good' and perceive threats more readily than opportunities in a given situation.
This leads toward loss aversion.
Read more about negativity bias.
Our expectations unconsciously influence how we perceive an outcome. Researchers looking for a certain result in an experiment, for example, may inadvertently manipulate or interpret the data so that it confirms their expectations. That's why 'double-blind' experimental design was created for scientific research.
Read more about observer-expectancy effect.
The tendency to judge harmful actions as worse than equally harmful inactions. For example, we judge crashing a car while drunk as worse than standing by while a drunk friend drives and crashes, even though the harm is the same.
Read more about omission bias.
When we believe the world is a better place than it is, we aren't prepared for the danger and violence we may encounter. The inability to accept the full breadth of human nature leaves us vulnerable.
Read more about optimism bias.
People take action in response to extreme situations. Then when the situations become less extreme, they take credit for causing the change, when a more likely explanation is that the situation was reverting to the mean.
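Regression to the mean is easy to reproduce with made-up numbers. In this hypothetical sketch, every score is a common skill level plus pure luck, so the worst performers in one round 'improve' in the next round with no intervention at all:

```python
import random

random.seed(0)

# Every performance is the same underlying skill (100) plus luck.
def performance():
    return 100 + random.gauss(0, 15)

round1 = [performance() for _ in range(10_000)]
round2 = [performance() for _ in range(10_000)]

# The 100 worst performers in round 1...
worst = sorted(range(10_000), key=lambda i: round1[i])[:100]

# ...score much closer to average in round 2, untouched.
avg1 = sum(round1[i] for i in worst) / len(worst)
avg2 = sum(round2[i] for i in worst) / len(worst)
print(f"worst group -- round 1: {avg1:.1f}, round 2: {avg2:.1f}")
```

Any 'intervention' applied between the two rounds would have looked like a dramatic success.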
Read more about regression bias.
Over-reliance on expert advice. This has to do with the avoidance of responsibility. We call in 'experts' to forecast when, in fact, they have no greater chance of predicting an outcome than the rest of the population. In other words, 'for every seer there's a sucker.'
Read more about seersucker illusion.
An error that comes from focusing only on surviving examples, causing us to misevaluate a situation. For instance, we might think that being an entrepreneur is easy because we haven't heard of all of the entrepreneurs who have failed.
It can also cause us to assume that survivors are inordinately better than failures, without regard for the importance of luck.
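A toy simulation (hypothetical funds and numbers, not market data) shows how dropping the failures inflates the apparent average:

```python
import random

random.seed(1)

# 10,000 hypothetical funds; every return is drawn from the same
# distribution (mean 5%, sd 20%). Funds returning below -10% shut
# down and disappear from the record.
returns = [random.gauss(0.05, 0.20) for _ in range(10_000)]
survivors = [r for r in returns if r > -0.10]

all_avg = sum(returns) / len(returns)
surv_avg = sum(survivors) / len(survivors)
print(f"all funds: {all_avg:+.1%}   survivors only: {surv_avg:+.1%}")
```

Nothing about the surviving funds was better; the record we see was simply filtered after the fact.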
Read more about survivorship bias.
We overuse common resources because it's not in any individual's interest to conserve them. This explains the overuse of natural resources, opportunism, and any acts of self-interest over collective interest.
Read more about the tragedy of the commons.
We believe that there is an optimal unit size -- a universally acknowledged amount of a given item that is perceived as appropriate. This explains why we eat more when served larger portions.
Read more about unit bias.
The preference for reducing a small risk to zero over achieving a greater reduction in a larger risk.
This plays to our desire to have complete control over a single, more minor outcome, over the desire for more -- but not complete -- control over a greater, more unpredictable outcome.
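A quick expected-value calculation (with hypothetical numbers) shows why the zero-risk option can be the worse deal:

```python
# Hypothetical: each risk, if it occurs, costs the same 10,000.
loss = 10_000

# Option A: eliminate a small risk entirely (1% -> 0%).
saved_a = (0.01 - 0.00) * loss

# Option B: halve a large risk (50% -> 25%).
saved_b = (0.50 - 0.25) * loss

# Option B avoids 25x more expected loss, yet zero-risk bias
# pulls people toward the certainty of option A.
print(saved_a, saved_b)
```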
Read more about zero-risk bias.