The Art of Thinking Clearly

Author: Rolf Dobelli

Rating:
3.7/5

Themes: Mental Models

Summary Sentence: Think more clearly by learning 99 of the biggest “cognitive errors” – repeated failures to think clearly that humans tend to make.
Review: Each chapter uses about 3 pages to describe a “cognitive error” with fun, interesting examples and a little advice on how to handle it. Most chapters don’t give clear definitions of the errors; rather, they are explained through the examples. Overall the book was thought-provoking and entertaining to read.
Other Resources: Amazon | Goodreads | Four Minute Books | Graham Mann | Farnam Street | Best Book Bits (Youtube)

Cognitive Error: “The failure to think clearly, or what experts call a “cognitive error,” is a systematic deviation from logic—from optimal, rational, reasonable thought and behavior. By “systematic,” I mean that these are not just occasional errors in judgment but rather routine mistakes, barriers to logic we stumble over time and again, repeating patterns through generations and through the centuries.”

“…my wish is quite simple: If we could learn to recognize and evade the biggest errors in thinking—in our private lives, at work, or in government—we might experience a leap in prosperity. We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity—all we need is less irrationality.”

Cognitive Errors:

1 Survivorship Bias: Success stories are more visible than failures. Therefore, we tend to think we are more capable of success than we really are. If you look at the failures you’ll notice that they have many of the same things you thought led to success.

2 Swimmer’s Body Illusion: When we confuse selection factors with results. You can’t get the body of a great swimmer (result) by just swimming a lot. Great swimmers are great because they genetically have swimmer bodies (selection factor).

3 Clustering Illusion: Humans have a hard time accepting that random data contains clusters (e.g., rolling a 4 six times in a row). We are oversensitive to pattern recognition. A streak might just be due to luck or chance.
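
A quick way to convince yourself that streaks show up in pure noise is to simulate them. This is a minimal Python sketch of my own (not from the book); the die, the run length, and the number of rolls are arbitrary choices:

```python
import random

random.seed(0)
rolls = [random.randint(1, 6) for _ in range(10_000)]  # 10,000 fair die rolls

# Count runs where the same face appears four or more times in a row.
runs, streak = 0, 1
for prev, cur in zip(rolls, rolls[1:]):
    if cur == prev:
        streak += 1
    else:
        runs += streak >= 4
        streak = 1
runs += streak >= 4

print(f"Runs of 4+ identical faces in 10,000 rolls: {runs}")
# A fair die still produces clusters like this; they are chance, not a pattern.
```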

4 Social Proof: In order to know what’s correct we often look at other people. The more people we see doing something, the more correct we assume it is.

5 Sunk Cost Fallacy: People want to continue doing something after they’ve invested a lot of time, energy, money, or love into it (Ex: We’ve already dated for 5 years… I already spent 2 years on this course). At a given point in time you will have spent it regardless, so don’t let it cloud your judgment. Forget about the sunk cost. Only look at future costs and benefits.

6 Reciprocity: People don’t like to be in someone else’s debt. We feel the urge to return the favor. We also feel the need to seek revenge and retaliate.

7 Confirmation Bias: We tend to focus on information that supports our existing beliefs and filter out anything that doesn’t. Instead we should thoroughly investigate things we don’t initially agree with to determine whether we are actually wrong.

8 Confirmation Bias (Part 2): See above – the book devotes a second chapter to this error.

9 Authority Bias: People are more obedient towards people they view as authority figures. Authority figures might have ulterior motives and are using their authority to manipulate you. Challenge them.

10 Contrast Effect: “We judge something to be beautiful, expensive, or large if we have something ugly, cheap, or small in front of us.” You might never purchase a tie for $50. But if you just bought a $700 suit, the extra $50 doesn’t seem so bad.

11 Availability Bias: We are more influenced by information that easily comes to mind or is easy to obtain. We don’t want to do the work of finding the more important (or correct) but harder-to-get information. However, in reality things don’t happen more often just because we can think of them more easily.

12 It’ll Get Worse Before It Gets Better Fallacy: A phrase used so that the predictor wins either way: if the bad situation gets worse, the prediction looks accurate, and if the situation gets better, the predictor looks skilled at resolving it.

13 Story Bias: People like things to have coherent stories even if they aren’t accurate. We attribute WWII to Hitler even though its origins were far more complex. We like our lives to have a meaning, guiding principle, or calling, even though life is complex and people can have multiple meanings for life.

14 Hindsight Bias: We tend to view the past as clearer, more inevitable, and more predictable than it really was. People are bad predictors and the world is a very unpredictable place. “I knew that would happen”. Hindsight is 20/20.

15 Overconfidence Effect: People tend to be more confident in their knowledge or abilities than they should be. For example, 84% of Frenchmen think they are above-average lovers, when by definition only about 50% can be. Be careful to be realistic when estimating your knowledge and making plans.

16 Chauffeur Knowledge: Chauffeur knowledge isn’t real knowledge because it doesn’t come from understanding; it comes from listening to others and regurgitating what they say. For example, a blogger who writes a short, shallow post by skimming other people’s posts. Stay in your circle of competence.

16.1 Circle of Competence: Out of all the knowledge in the world there is only a small circle of it that you actually understand. Stay in your circle.

17 Illusion of Control: People tend to think they have more control over the world than they do. The world around you is chaotic and most of it is outside your control. For example, many traffic light buttons are fake. Despite this, people tend to feel like they are influencing the lights when they click the button. In life, focus on the things that are actually in your control.

18 Incentive Super-Response Tendency: People respond to incentives with actions that are in their own self-interest and often disregard the underlying intentions. A manager wants his company to produce more nails, so he rewards employees by the quantity of nails they produce. The employees start cutting nails in half to make two smaller ones.

19 Regression to the Mean: When unusual and outlier events occur, the following events will likely be more usual (like the average or mean). Beginner’s luck is short lived.
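
A small simulation makes this concrete. In this sketch (my own illustration, assuming each test score is underlying skill plus independent luck), the top scorers on one test score noticeably worse on the next:

```python
import random

random.seed(1)
N = 10_000
skill = [random.gauss(100, 10) for _ in range(N)]

# Each observed score = true skill + independent luck (noise).
test1 = [s + random.gauss(0, 10) for s in skill]
test2 = [s + random.gauss(0, 10) for s in skill]

# Pick the people who looked exceptional on the first test (top 1%).
top = sorted(range(N), key=lambda i: test1[i], reverse=True)[: N // 100]
avg = lambda xs: sum(xs) / len(xs)

print(f"top 1% on test 1:    {avg([test1[i] for i in top]):.1f}")
print(f"same people, test 2: {avg([test2[i] for i in top]):.1f}")
# The lucky part of the outlier doesn't repeat, so the second average falls
# back toward the mean.
```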

20 Outcome Bias: People tend to evaluate decision processes based on their results in the past and not on the process itself. “It worked last time so it should work this time, right?” The previous success might have been luck. The current conditions might be too different.

21 Paradox of Choice: People like to have more options up to a limit. Beyond that limit they face conditions like analysis paralysis and fear of making a suboptimal choice (regret). When you have a lot of options, just aim for good enough.

22 Liking Bias: We are more agreeable with people we like. Watch out for people mirroring your behaviors and giving you compliments.

23 Endowment Effect: We tend to think things are more valuable if we own them. You might charge $150 for your couch on eBay, but if you were the buyer there’s no way you’d pay $150 for it.

24 Coincidence: People tend to think rare or unusual events are more remarkable than they really are. People think about their friends a lot every day, so the fact that your friend called right after you thought of him isn’t that surprising. Improbable coincidences are rare, but it would be more remarkable if they never happened at all.

25 Groupthink: When people go along with a decision just because everyone else seems to be agreeing with it. Nobody wants to be the naysayer and break up the team spirit. But that’s better than making reckless decisions.

26 Neglect of Probability: Humans lack an “intuitive grasp of probability”. We end up responding to the magnitude of an event rather than its likelihood. For example, a 0 percent risk seems so much better to us than a 0.1 percent risk. People are more scared of terrorist attacks than they are of heart disease.

27 Scarcity Error: People think things are more valuable when their availability is limited. We should assess things based on their utility and benefits, not on how many there are.

28 Base-Rate Neglect: People tend to disregard the base rate (the percentage of the population likely to have the quality in question). You might think it’s more likely that a man was stabbed by a Russian knife importer than by a middle-class American; however, there are vastly more middle-class Americans than Russian knife importers.
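
The point is easier to see with numbers. The figures below are made up purely for illustration (they are not from the book):

```python
# Hypothetical population sizes and per-person likelihoods, for illustration only.
middle_class_americans = 50_000_000
russian_knife_importers = 5_000

p_american_stabber = 0.000001                  # assumed baseline likelihood
p_importer_stabber = p_american_stabber * 100  # assume importers are 100x more likely

print(middle_class_americans * p_american_stabber)   # ~50 expected cases
print(russian_knife_importers * p_importer_stabber)  # ~0.5 expected cases
# Even with a 100x higher per-person likelihood, the enormous base rate means
# the stabber is still far more likely to be a middle-class American.
```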

29 Gambler’s Fallacy: People tend to think that previous events will affect their next results even though they are independent. A die cannot remember how many times it landed on 4. There is no balancing force in the universe. (Note: Regression to the Mean refers to complex systems which have dependent events).
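
You can check the “no memory” claim directly with a simulation. This is my own sketch, not the book’s:

```python
import random

random.seed(2)
streaks, next_is_four = 0, 0

for _ in range(1_000_000):
    rolls = [random.randint(1, 6) for _ in range(4)]
    if rolls[:3] == [4, 4, 4]:        # three 4s in a row
        streaks += 1
        next_is_four += rolls[3] == 4

print(f"P(4 right after three 4s) ≈ {next_is_four / streaks:.3f} (1/6 ≈ 0.167)")
# The die doesn't remember the streak: the next roll is still about 1/6.
```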

30 The Anchor: Humans are influenced by anchors (something we base our thoughts off of) when we think about things. After being told to select random numbers, people who chose higher numbers ended up bidding higher in an auction. Teachers’ grading is influenced if they know students’ past grades.

31 Induction: We tend to draw universal conclusions from individual observations. Every day a turkey is fed by the farmer, so it assumes the farmer loves it – until one day the farmer chops off its head. We think humans have survived so long that we will continue to survive forever. The world can always change.

32 Loss Aversion: People are more motivated to avoid losses than to make similar gains. Losing $10 hurts roughly twice as much as gaining $10 feels good.

33 Social Loafing: When individual performance is not directly visible people tend to slack and blend into the group. “Why invest all of your energy when half will do…”

34 Exponential Growth: People have a bad intuition for exponential growth and compounding. Small changes can lead to something really catastrophic over time but they go unnoticed.
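
A tiny compounding calculation shows why intuition fails here. The 5% rate and the time horizons below are arbitrary, chosen only to illustrate:

```python
# How a "small" 5% annual growth rate compounds (figures are illustrative).
rate, start = 0.05, 100.0

for years in (10, 20, 30, 40):
    print(years, round(start * (1 + rate) ** years, 1))
# 10 -> 162.9, 20 -> 265.3, 30 -> 432.2, 40 -> 704.0
# Linear intuition expects roughly +5 per year (ending near 300);
# compounding turns 100 into roughly 700 over 40 years.
```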

35 Winner’s Curse: “…the winner of an auction often turns out to be the loser.” In order to secure a job or purchase a product we will try to outdo our competitors. This leads to us accepting a job for a wage that’s too low or buying a product at a price that’s ridiculously high.

36 Fundamental Attribution Error: “…the tendency to overestimate individuals’ influence and underestimate external, situational factors.” Sally probably didn’t lie on her timecard because she is a genuinely malicious person who’s cheating the system, but because she was in a difficult circumstance where she didn’t know what to do. Hitler probably didn’t single-handedly cause WWII; it was more likely due to a variety of different factors.

37 False Causality: People sometimes make false conclusions about what caused events to occur. More firefighters didn’t cause the fire to do more damage – more firefighters were called because the fire was doing more damage. Correlation does not imply causation.

38 Halo Effect: One quality of something affects how we view the entire thing. We take advice from a pro athlete endorsing a microwave but they aren’t experts in microwaves. We think a successful CEO at one company will also be successful at other companies. Attractive people are viewed as more trustworthy.

39 Alternative Paths: The other outcomes in life that could have happened but didn’t. There are opportunity costs and risks associated with the other paths. These paths aren’t visible so we don’t think about them. Don’t forget that some paths could lead to ruin.

40 Forecast Illusion: Humans are bad at making predictions. When you hear an expert give a prediction question what incentives they might have for making it.

41 Conjunction Fallacy: The probability of two events happening together (in conjunction) is always less than or equal to the probability of either one happening alone. Sometimes people will choose a less likely option because it has a better story behind it. The lawyer Chris likes puppies. Is it more likely that Chris is a lawyer at a big firm, or that Chris is a lawyer at a big firm that defends animal rights? Many people choose the second option because it sounds nicer, but it has more conditions, making it less likely.
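
Put as arithmetic: adding a condition can only keep a probability the same or make it smaller. The numbers below are invented just to make the inequality concrete:

```python
# Hypothetical probabilities, for illustration only.
p_big_firm_lawyer = 0.05              # P(Chris is a lawyer at a big firm)
p_animal_rights_given_that = 0.10     # P(the firm defends animal rights | above)

p_both = p_big_firm_lawyer * p_animal_rights_given_that
print(p_big_firm_lawyer, p_both)      # 0.05 vs 0.005
# P(A and B) <= P(A) always holds, no matter how good the richer story sounds.
```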

42 Framing: Messages are received differently depending on how you communicate them (how you frame them). The pill helps 33% of patients survive → Yay! The pill leaves patients with a 67% chance of dying → Oh no! But these describe the same result.

43 Action Bias: Humans want to take action rather than do nothing about a situation. Especially if the situation is unknown or unusual. For example, young officers are more eager to jump into a dangerous altercation than the Senior officers. Senior officers give a calming presence which leads to fewer casualties.

44 Omission Bias: Choosing inaction feels better than taking an action that leads to a bad result. If we push a man in front of a car to stop it, we will feel worse than if we had just let the car (whose driver was texting) run into a school bus full of children. “If you’re not part of the solution, you’re part of the problem.”

45 Self-Serving Bias: “We attribute success to ourselves and failures to external factors.”

46 Hedonic Treadmill: Whether something makes us happy or sad, we eventually drift back to our normal baseline. The duration and intensity of emotions fade over time.

47 Self-Selection Bias: This bias occurs when people choose which group they end up in. For example, a man may complain that there are too few women at his company. However, it is likely that he chose to work at a place where mostly men like to work – he selected himself into that group, not a different one.

48 Association Bias: People’s perceptions of something are influenced by what it’s associated with. We think Coca Cola is great because a bikini model is drinking it. We condemn bearers of bad news.

49 Beginner’s Luck: Initial success can make people take more risks even though the success might really be attributed to chance. They may even think they are just special. Eventually the probabilities will normalize.

50 Cognitive Dissonance: Mental discomfort that’s felt when you have an internal conflict. After not getting the job you convince yourself you didn’t want it anyways. When you pledge that you like charity and then someone asks you for a donation, you are more likely to comply.

51 Hyperbolic Discounting: People tend to discount things that are far away more than ones that are closer. People prefer instant gratification over delayed gratification. Most people would rather have $1000 today than $1100 in a month.
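
To see how steep that near-term discount is, here is a small back-of-the-envelope calculation (my own, not the book’s):

```python
# Choosing $1000 now over $1100 in a month means turning down this implied return.
now, later = 1000, 1100

monthly_return = later / now - 1              # 10% for waiting one month
annualized = (1 + monthly_return) ** 12 - 1   # compounded over a year

print(f"{monthly_return:.0%} per month ≈ {annualized:.0%} per year")
# Few investments offer ~214% a year, yet many of us reject it for instant gratification.
```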

52 “Because” Justification: Justifying your behavior will make people more compliant even if it’s a poor justification.

53 Decision Fatigue: Making decisions drains your willpower and energy. Try to reduce your decisions or make important decisions earlier in the day.

54 Contagion Bias: We tend to feel a connection between an item and the person who owned or used it. You wouldn’t want to wear a perfectly good sweater if it had been worn by Hitler.

55 The Problem With Averages: An outlier can have a dramatic effect on the average value. Averages can hide the real distribution. A river with an average depth of 3 ft can be mostly very shallow but extremely deep at one stretch.
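
The river example in numbers, with made-up depths:

```python
from statistics import mean, median

# Assumed profile: mostly ankle-deep, one very deep stretch.
depths_ft = [1, 1, 1, 1, 1, 1, 1, 1, 1, 21]

print(f"mean = {mean(depths_ft)} ft, median = {median(depths_ft)} ft, max = {max(depths_ft)} ft")
# mean = 3 ft, median = 1.0 ft, max = 21 ft
# The "average depth of 3 ft" hides a stretch deep enough to drown in.
```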

56 Motivation Crowding: When someone does a good deed out of goodwill, offering compensation afterwards negates the good feeling. The monetary reward crowds out the original motivation of doing something out of the goodness of their heart.

57 Twaddle Tendency: When people ramble or use overly sophisticated words in order to hide their laziness, stupidity, or mediocre ideas.

58 Will Rogers Phenomenon: By redistributing the members of different groups, you can raise both of their averages. If you take the “worst” performer out of a group of high-achievers and add them to a group of low-achievers, then both groups’ averages will rise.
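
A worked example with invented scores shows the trick:

```python
from statistics import mean

# Invented scores, purely for illustration.
high = [90, 85, 80, 70]   # mean 81.25
low  = [60, 50, 40]       # mean 50

# Move the weakest "high" performer (70) into the low group.
high_after = [90, 85, 80]       # mean 85 -> went up
low_after  = [70, 60, 50, 40]   # mean 55 -> also went up

print(mean(high), mean(low), "->", mean(high_after), mean(low_after))
# Nobody's score changed, yet both group averages rose.
```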

59 Information Bias: Trying to get more information may make the situation worse. You could spend too much time collecting information or you might face analysis paralysis.

60 Effort Justification: After people put a lot of effort into a task, they tend to overvalue its results. Fraternities and gangs have difficult initiations so that you feel greater pride in being a member.

61 The Law of Small Numbers: Small sample sizes can have a lot of fluctuation. A start-up only has a few people so the average IQ could be really high. A large corporation is likely to have an average IQ that’s more typical. Be careful when hearing amazing statistics about small groups.
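
A quick simulation (my own sketch, assuming IQ is roughly normal with mean 100 and standard deviation 15) shows why small groups produce the eye-catching averages:

```python
import random
from statistics import mean

random.seed(3)
draw_iq = lambda: random.gauss(100, 15)

# Average IQ of many small "start-ups" (5 people) vs a few huge "corporations" (5,000).
small = [mean(draw_iq() for _ in range(5)) for _ in range(1_000)]
large = [mean(draw_iq() for _ in range(5_000)) for _ in range(10)]

print(f"5-person groups:    averages range {min(small):.0f} to {max(small):.0f}")
print(f"5000-person groups: averages range {min(large):.0f} to {max(large):.0f}")
# Small samples swing wildly around 100; large samples barely move from it.
```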

62 Expectations: Higher expectations of people tend to result in higher performance from them.

63 Simple Logic: People tend to think with as little energy as possible which leads to going with intuition. However, the intuitive (simple) approach is not always correct. Be careful with assumptions and with what pops into your head.

64 Forer Effect: People tend to overly identify with descriptions of them that are actually very universal. For example, psychic readings make statements that tend to be generic and flattering which makes us want to accept them.

65 Volunteer’s Folly: Volunteering makes you feel great but it’s probably better for you to hire a professional than to do a mediocre job yourself. Volunteering is only useful if others can make use of your expertise. Otherwise you should save some money and donate it.

66 Affect Heuristic: People use “heuristics” to make quick decisions, and “affect” is how much you like or dislike something. The “affect heuristic” is people’s tendency to use how much they like someone as a quick guide to how much they trust them, even though our emotional feelings toward a person aren’t necessarily a good guide.

67 Introspection Illusion: People tend to have overconfidence in their conclusions from self-reflection and introspection. We think they lead to truths and we reject people that don’t share the same beliefs. However we must realize that our conclusions are often contrived and subject to our internal biases.

68 Inability to Close Doors: People try in vain to keep as many options available to them in the future. However, this can lead to analysis paralysis and not making any progress.

69 Neomania: People tend to think that new is better. However we should not forget about the things that have withstood the test of time. “Whatever has survived for X years will last another X years.”

70 Sleeper Effect: Over time people tend to forget the source of an argument and only remember the argument itself. Therefore, over time we forget that the source was untrustworthy, which makes the argument gain credibility.

71 Alternative Blindness: People tend to forget that there are many other alternatives. For example, you are not stuck choosing between “get an MBA” and “no MBA.” You could instead take an online course that offers a certification.

72 Social Comparison Bias: People often don’t want to help others that could outperform them. However this could be detrimental. If everyone in your company hires people that are worse than them then you’ll end up with a company of Bozos. If you have smarter people around then you can benefit and learn from all their contributions in the long term.

73 Primacy and Recency Effects: People tend to weight first impressions and initial information heavily, judging everything else against them (primacy). People also tend to favor the most recent information, because it is what is freshest in our minds (recency).

74 Not-Invented-Here Syndrome: People tend to favor ideas that they came up with themselves, even if others’ ideas are objectively better. Don’t overlook the ideas of others, even if those people are very different from you.

75 The Black Swan: “A Black Swan is an unthinkable event that massively affects your life, your career, your company, your country. There are positive and negative Black Swans.” The key point is that they are unthinkable (unknown). Life is unpredictable. Do your best to plan for your plans not going according to plan. (Ex: Keep an emergency fund.)

76 Domain Dependence: Some things need to stay in their specific domain in order to function properly. For example, an all-star CEO at a tech company might not be one at an agriculture company. An insight from biology might not apply well to finance.

77 False-Consensus Effect: “We frequently overestimate unanimity with others, believing that everyone else thinks and feels exactly like we do.” An inventor of a product might believe that customers will think the product is great too. Question your own assumptions.

78 Falsification of History: People tend to subconsciously adjust past views to fit present ones. Thus we think we were right all along. This is a way our brain helps us cope with the embarrassment of being wrong and admitting mistakes.

79 In-Group Out-Group Bias: People can form groups based on insignificant criteria like a shared football team. People can also form groups based on common values and views. People view others outside their own group as being more similar than they really are (us vs them). People tend to favor their own group.

80 Ambiguity Aversion: People prefer known probabilities (risk) over unknown ones (uncertainty). However, recognize that there are only a few places where there is risk without uncertainty (coin tosses, rolling dice, casinos, probability classes).

81 Default Effect: People are more likely to go with whatever the default option is, or to simply follow the status quo. Making the default Uber driver tip higher can increase the tips customers give.

82 Fear of Regret: People are afraid of regretting choices they make. This can lead to them holding onto things they don’t require or analysis paralysis of choices.

83 Salience Effect: People tend to attribute things too much to features that stand out (salient features). For example, a news reporter may claim the person got the job just because she is a woman. Being a woman stood out because most of the other employees were men.

84 House-Money Effect: People treat money differently depending on how they acquired it. If people work hard and earn $20,000 they are likely to invest it. However, if they won a lottery and got $20,000 they are more likely to spend it. “We treat money that we win, discover, or inherit much more frivolously than hard-earned cash.”

85 Procrastination: Delaying tasks that are unpleasant but important. It’s idiotic because they are important and they won’t complete themselves. Use artificial deadlines to fight procrastination.

86 Envy: People act irrationally when they are envious of another – refusing to help, sabotaging plans, secretly being happy when they fail, etc. Envy is stupid because there is nothing gained from it. Envy is actually flattering to the person you envy so much.

87 Personification: People are more influenced when stories showcase human qualities in a situation. Try to focus on the objective facts, not the story around them. People are more likely to donate if they see an image of a starving child than if they hear statistics about poverty in a country.

88 Illusion of Attention: “We are confident that we notice everything that takes place in front of us. But in reality, we often see only what we are focusing on”. Pay attention to silences as much as noise. Think about impossible scenarios.

89 Strategic Misrepresentation: People often misrepresent themselves in order to appear better, and the more extreme the scenario, the more they misrepresent themselves. An average woman will put on more makeup than a beautiful one. When your job interviewer asks if you can handle an outrageous task, you say, “Consider it done.”

90 Overthinking: Thinking about all the details can be crucial but it takes a long time. When you’re still a novice it’s probably wise to think about all the details. However, when you get better at it you can think less in order to move faster. Overthinking can lead to analysis paralysis.

91 Planning Fallacy: People tend to overestimate the amount they can get done in their plans. Groups especially, “…overestimate duration and benefits and systematically underestimate costs and risks.”

92 Déformation Professionnelle: An individual learns a set of ways of looking at the world. They apply those techniques to every situation. However, there may be better techniques for the situation that they are unaware of. It’s dangerous when people try to apply their techniques in areas where they don’t work well. You should strive to get many mental models for understanding the world.

93 Zeigarnik Effect: Uncompleted tasks in our heads pester us until we give them attention by completing them or writing them down. Try carrying around a notepad and writing down everything you have to do that pops into your mind. This will stop the pestering and help clear your mind.

94 Illusion of Skill: A person that is skilled in one area may not be skilled in another. A successful CEO in the tech world may not be successful in the real-estate world.

95 Feature-Positive Effect: People are bad at thinking about the things that are not present. An unhealthy salad dressing lists all the vitamins on the cover so we think it’s healthy. We don’t think to question the cholesterol level since it’s not listed.

96 Cherry Picking: “showcasing the most attractive features and hiding the rest.”

97 Fallacy of the Single Cause: Many situations are complex and chaotic so there are many reasons that caused an event to happen. However, humans like simple stories and casting blame so single reasons are often chosen.

98 Intention-to-Treat Error: When we accidentally place things that don’t pass a test into the wrong category and draw a bad conclusion. The driver showed up after an hour, so he must be a slow driver – actually he got into a car accident. Patients who regularly took the pill improved, so the pill is great – actually many patients took the pill irregularly because of its horrible side effects.

99 News Illusion: “News is to the mind what sugar is to the body: appetizing, easy to digest—and highly destructive in the long run.” Many people think they should read news to become better and to understand the world. In reality news is biased, scares us, is irrelevant, and is a big waste of time. Read books instead of the news.
