What Mental Biases Do You Have?

After reading about these mental biases, you’ll probably say that you knew them all along… and that itself is an example of a mental bias, just what we’ll discuss today.

I recently came across Charlie Munger’s 1995 speech, The Psychology of Human Misjudgment, which gave me an explanation of what I’d been doing all my business life: applying mental tools from a wide array of disciplines to the business questions I had.

Last week I introduced tools for explaining. Today’s mental tools are analogies, models, and heuristics about the mental biases that can inhibit your ability to work. They are concepts you can use to help explain things and find answers to the critical questions in your business.

Today let’s discuss Cognitive Biases — “Tendencies to think in certain ways that can lead to systematic deviations from a standard of rationality or good judgments.”  We are all biased in some way, and by that I mean our brains make us misjudge things. So it’s best to try to understand just how biased we are, and biased in which ways.

Here are some of the biases or misjudgements we can become subject to, grouped under a series of shortcuts in thinking that bias us towards bad decisions. These shortcuts, or heuristics, are described as “judgments that generally get us where we need to go – and quickly – but at the cost of occasionally sending us off course.” Heuristics are useful because they reduce effort and simplify decision-making, so long as they lead us in the right direction. Here are some that put us wrong.

The availability heuristic is a mental shortcut that relies on the immediate examples that come to mind when evaluating a specific topic, concept, method or decision. The availability heuristic operates on the notion that if something can be recalled, it must be important, or at least more important than alternative solutions which are not as readily recalled. Consequently, you tend to weight your judgments heavily toward more recent information, making new opinions biased toward the latest news over older information.


The representativeness heuristic is used when making judgments about the probability of an uncertain event. It is one of a group of heuristics (simple rules governing judgment or decision-making) proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s.  They defined representativeness as “the degree to which something is similar in characteristics to the overall parent population, and reflects the critical features of the population.” When people rely on representativeness to make judgments, they are likely to judge wrongly, for two reasons: what they think is representative of the population may actually not be, and even when something does represent the population, that doesn’t make it the right answer.

– Insensitivity to base rates: you ignore information about the overall population in favour of specific information about a single example.
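A quick worked example of why base rates matter (the numbers are hypothetical, chosen only for illustration): suppose a condition affects 1% of a population, and a test for it has a 90% true-positive rate and a 10% false-positive rate. Intuition focuses on the 90% figure; Bayes’ rule, which weighs in the base rate, gives a very different answer.

```python
# Hypothetical numbers chosen to illustrate base-rate neglect.
base_rate = 0.01          # P(condition): 1% of the population
true_positive = 0.90      # P(positive test | condition)
false_positive = 0.10     # P(positive test | no condition)

# Bayes' rule: P(condition | positive test)
p_positive = true_positive * base_rate + false_positive * (1 - base_rate)
p_condition_given_positive = (true_positive * base_rate) / p_positive

print(round(p_condition_given_positive, 3))  # 0.083
```

Despite the “90% accurate” test, a positive result here means only about an 8% chance of having the condition, because the specific information (the test) is swamped by the base rate (the condition is rare).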

– Insensitivity to sample size: smaller samples give greater errors; larger samples are a better mirror of the overall population. Variation is more likely in smaller samples, but people may not expect this.
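The sample-size point above is easy to see in a minimal coin-toss simulation (a sketch, not from the original text): repeated small samples of a fair coin swing far more widely than repeated large samples.

```python
import random

random.seed(42)

def heads_fraction(n):
    """Fraction of heads observed in n tosses of a fair coin."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# 1000 repeated samples of each size.
small = [heads_fraction(10) for _ in range(1000)]
large = [heads_fraction(1000) for _ in range(1000)]

def spread(fractions):
    """Range between the most extreme samples observed."""
    return max(fractions) - min(fractions)

print(spread(small))  # wide: 10-toss samples can land far from 0.5
print(spread(large))  # narrow: 1000-toss samples cluster near 0.5
```

The true proportion is 0.5 in both cases; only the sample size changes, yet the small samples produce far more extreme results.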


– Misconceptions of chance: the mistaken belief that if something happens more frequently than normal during a period, it will happen less frequently in the future, or that if something happens less frequently than normal during some period, it will happen more frequently in the future (presumably as a means of balancing nature).  If you toss 5 heads in a row, you think tails is more likely to come up next. It won’t; it’s still 50-50.
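The coin-toss claim above can be checked directly with a small simulation (a sketch): record the flip that follows every run of five heads, and see how often it comes up heads.

```python
import random

random.seed(0)

# Whenever five heads in a row occur, record the very next flip.
next_flips = []
streak = 0  # current run of consecutive heads
for _ in range(200_000):
    flip = random.random() < 0.5  # True = heads
    if streak >= 5:
        next_flips.append(flip)
    streak = streak + 1 if flip else 0

print(sum(next_flips) / len(next_flips))  # close to 0.5
```

The coin has no memory: the fraction of heads after a five-head streak stays near 50%, exactly as on any other flip.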

– Regression to the mean: the phenomenon that if a variable is extreme on its first measurement, it will tend to be closer to the average on its second measurement, and if it is extreme on its second measurement, it will tend to have been closer to the average on its first. I would agree with this, as long as the underlying situation remains the same.  But if the environment changes, things may NOT regress to the old mean.


Biases emanating from the Confirmation Heuristic: this is the tendency to search for, interpret, favour, and recall information in a way that confirms one’s preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities.


Anchoring bias is the common human tendency to rely too heavily on the first piece of information offered (the “anchor”) when making decisions. Once an anchor is set, other judgments are made by adjusting away from it, and there is a bias toward interpreting other information around the anchor. For example, the initial price offered for a used car sets the standard for the rest of the negotiation, so that prices lower than the initial price seem more reasonable even if they are still higher than what the car is really worth.


Over-confidence bias occurs when your subjective confidence in your judgment is greater than the actual, objective accuracy of those judgments, especially when your confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. There are three types of overconfidence: (1) overestimation of one’s actual performance; (2) overplacement of one’s performance relative to others; and (3) overprecision, expressing unwarranted certainty in the accuracy of one’s beliefs.


– Hindsight Bias, also known as the “knew-it-all-along” effect, is the tendency, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it.  A basic example of the hindsight bias is when, after viewing the outcome of a potentially unforeseeable event, a person believes he or she “knew it all along”. Such examples are present in the writings of historians describing outcomes of battles, physicians recalling clinical trials, and judicial systems trying to attribute responsibility and predictability for accidents.  This builds false confidence in your ability to predict what could occur in future.