This is a book summary of The Art of Thinking Clearly by Rolf Dobelli (Amazon).
🔒 Premium members have access to the full summary of 98 (!) cognitive concepts: How to Think Better with “The Art of Thinking Clearly” by Rolf Dobelli (Full List of 98 Biases, Fallacies, & Misjudgments)
Quick Housekeeping:
- All content in quotation marks is from the author unless otherwise stated.
- I’ve organized content into my own themes/order vs the author’s chapters.
- I’ve added emphasis in bold for readability/skimmability.
Book Summary Contents: Click a link here to jump to a section below
- Action Bias
- Authority Bias
- Black Swan
- Confirmation Bias
- Effort Justification
- Endowment Effect
- False-Consensus Effect
- False Causality
- Forer Effect
- Fundamental Attribution Error
- Halo Effect
- Hindsight Bias
- Hyperbolic Discounting
- Incentive Super-Response Tendency
- Information Bias
- Introspection Illusion
- Loss Aversion
- Outcome Bias
- Primacy & Recency Effects
- Regression to Mean
- Single Cause Fallacy
- Sleeper Effect
- Story Bias
- Sunk Cost Fallacy
- Survivorship Bias
25 Cognitive Concepts from The Art of Thinking Clearly by Rolf Dobelli (Book Summary)
About the book The Art of Thinking Clearly
- “My wish is quite simple: If we could learn to recognize and evade the biggest errors in thinking—in our private lives, at work, or in government—we might experience a leap in prosperity. We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity—all we need is less irrationality.”
- “The failure to think clearly, or what experts call a ‘cognitive error,’ is a systematic deviation from logic—from optimal, rational, reasonable thought and behavior. By ‘systematic,’ I mean that these are not just occasional errors in judgment but rather routine mistakes, barriers to logic we stumble over time and again, repeating patterns through generations and through the centuries.”
- “This is not a how-to book. You won’t find ‘seven steps to an error-free life’ here. Cognitive errors are far too engrained to rid ourselves of them completely. Silencing them would require superhuman willpower, but that isn’t even a worthy goal. Not all cognitive errors are toxic, and some are even necessary for leading a good life. Although this book may not hold the key to happiness, at the very least it acts as insurance against too much self-induced unhappiness.”
- “We know with certainty what destroys success or happiness. This realization, as simple as it is, is fundamental: Negative knowledge (what not to do) is much more potent than positive knowledge (what to do). Thinking more clearly and acting more shrewdly means adopting Michelangelo’s method: Don’t focus on David. Instead, focus on everything that is not David and chisel it away. In our case: Eliminate all errors and better thinking will follow.”
1. Action Bias
- “In new or shaky circumstances, we feel compelled to do something, anything. Afterward we feel better, even if we have made things worse by acting too quickly or too often … The action bias causes us to offset a lack of clarity with futile hyperactivity and comes into play when a situation is fuzzy, muddy, or contradictory.”
2. Authority Bias
- “Whenever you are about to make a decision, think about which authority figures might be exerting an influence on your reasoning.”
3. Black Swan
- “‘All swans are white.’ For centuries, this statement was watertight. Every snowy specimen corroborated this. A swan in a different color? Unthinkable. That was until the year 1697, when Willem de Vlamingh saw a black swan for the first time during an expedition to Australia. Since then, black swans have become symbols of the improbable … A Black Swan is an unthinkable event that massively affects your life, your career, your company, your country. There are positive and negative Black Swans … Though we can continue to plan for the future, Black Swans often destroy our best-laid plans. Feedback loops and nonlinear influences interact and cause unexpected results.”
4. Confirmation Bias
- “The confirmation bias is the mother of all misconceptions. It is the tendency to interpret new information so that it becomes compatible with our existing theories, beliefs, and convictions. In other words, we filter out any new information that contradicts our existing views (‘disconfirming evidence’) … To fight against the confirmation bias, try writing down your beliefs—whether in terms of worldview, investments, marriage, health care, diet, career strategies—and set out to find disconfirming evidence.”
5. Effort Justification
- “When you put a lot of energy into a task, you tend to overvalue the result … Effort justification is a special case of cognitive dissonance … Whenever you have invested a lot of time and effort into something, stand back and examine the result—only the result.”
6. Endowment Effect
- “We consider things to be more valuable the moment we own them. In other words, if we are selling something, we charge more for it than what we ourselves would be willing to spend … Don’t cling to things. Consider your property something that the ‘universe’ (whatever you believe this to be) has bestowed to you temporarily. Keep in mind that it can recoup this (or more) in the blink of an eye.”
7. False-Consensus Effect
- “We frequently overestimate unanimity with others, believing that everyone else thinks and feels exactly like we do. This fallacy is called the false-consensus effect … Assume that your worldview is not borne by the public. More than that: Do not assume that those who think differently are idiots. Before you distrust them, question your own assumptions.”
8. False Causality
- “Correlation is not causality. Take a closer look at linked events: Sometimes what is presented as the cause turns out to be the effect, and vice versa. And sometimes there is no link at all.”
9. Forer Effect
- “People tend to identify many of their own traits in universal descriptions. Science labels this tendency the Forer effect (or the Barnum effect). The Forer effect explains why the pseudosciences work so well—astrology, astrotherapy, the study of handwriting, biorhythm analysis, palmistry, tarot card readings, and séances with the dead.”
10. Fundamental Attribution Error
- “The tendency to overestimate individuals’ influence and underestimate external, situational factors … As much as we are fascinated with the spectacle of life, the people onstage are not perfect, self-governed individuals. Instead, they tumble from situation to situation. If you want to understand the current play—really understand it—then forget about the performers. Pay close attention to the dance of influences to which the actors are subjected.”
11. Halo Effect
- “The halo effect occurs when a single aspect dazzles us and affects how we see the full picture … We take a simple-to-obtain or remarkable fact or detail, and extrapolate conclusions from there that are harder to nail down … The psychologist Edward Lee Thorndike discovered the halo effect nearly one hundred years ago. His conclusion was that a single quality (e.g., beauty, social status, age) produces a positive or negative impression that outshines everything else, and the overall effect is disproportionate.”
12. Hindsight Bias
- “The hindsight bias is one of the most prevailing fallacies of all. We can aptly describe it as the ‘I told you so’ phenomenon: In retrospect, everything seems clear and inevitable … It makes us believe we are better predictors than we actually are, causing us to be arrogant about our knowledge and consequently to take too much risk.”
13. Hyperbolic Discounting
- “The closer a reward is, the higher our ‘emotional interest rate’ rises and the more we are willing to give up in exchange for it … Hyperbolic discounting, the fact that immediacy magnetizes us … Though instantaneous reward is incredibly tempting, hyperbolic discounting is still a flaw. The more power we gain over our impulses, the better we can avoid this trap.”
14. Incentive Super-Response Tendency
- “People respond to incentives by doing what is in their best interests. What is noteworthy is, first, how quickly and radically people’s behavior changes when incentives come into play or are altered, and second, the fact that people respond to the incentives themselves, and not the grander intentions behind them … If a person’s or an organization’s behavior confounds you, ask yourself what incentive might lie behind it. I guarantee you that you’ll be able to explain 90 percent of the cases this way. What makes up the remaining 10 percent? Passion, idiocy, psychosis, or malice.”
15. Information Bias
- “The delusion that more information guarantees better decisions … Forget trying to amass all the data. Do your best to get by with the bare facts. It will help you make better decisions. Superfluous knowledge is worthless, whether you know it or not.”
16. Introspection Illusion
- “Introspection is not reliable. When we soul-search, we contrive the findings … The belief that reflection leads to truth or accuracy is called the introspection illusion … Nothing is more convincing than your own beliefs. We believe that introspection unearths genuine self-knowledge. Unfortunately, introspection is, in large part, fabrication posing two dangers: First, the introspection illusion creates inaccurate predictions of future mental states. Trust your internal observations too much and too long, and you might be in for a very rude awakening. Second, we believe that our introspections are more reliable than those of others, which creates an illusion of superiority.”
17. Loss Aversion
- “We fear loss more than we value gain … In fact, it has been proven that, emotionally, a loss ‘weighs’ about twice that of a similar gain … The fear of losing something motivates people more than the prospect of gaining something of equal value.”
18. Outcome Bias
- “We tend to evaluate decisions based on the result rather than on the decision process. This fallacy is also known as the historian error … Never judge a decision purely by its result, especially when randomness and ‘external factors’ play a role. A bad result does not automatically indicate a bad decision and vice versa. So rather than tearing your hair out about a wrong decision, or applauding yourself for one that may have only coincidentally led to success, remember why you chose what you did. Were your reasons rational and understandable? Then you would do well to stick with that method, even if you didn’t strike it lucky last time.”
19. Primacy & Recency Effects
- “The first traits outshine the rest. This is called the primacy effect … The contrasting recency effect matters as well. The more recent the information, the better we remember it … If you have to make an immediate decision based on a series of ‘impressions’ (such as characteristics, exam answers, etc.), the primacy effect weighs heavier. But if the series of impressions was formed some time ago, the recency effect dominates … First and last impressions dominate, meaning the content sandwiched between has only a weak influence.”
20. Regression to Mean
- “Extreme performances are interspersed with less extreme ones … When you hear stories such as: ‘I was sick, went to the doctor, and got better a few days later’ or ‘the company had a bad year, so we got a consultant in, and now the results are back to normal,’ look out for our old friend, the regression-to-mean error.”
21. Single Cause Fallacy
- “Any clear-thinking person knows that no single factor leads to such events. Rather, there are hundreds, thousands, an infinite number of factors that add up. Still, we keep trying to pin the blame on just one … The fallacy of the single cause is as ancient as it is dangerous. We have learned to see people as the ‘masters of their own destinies.’ Aristotle proclaimed this 2,500 years ago. Today we know that it is wrong. The notion of free will is up for debate. Our actions are brought about by the interaction of thousands of factors—from genetic predisposition to upbringing, from education to the concentration of hormones between individual brain cells. Still we hold firmly to the old image of self-governance. This is not only wrong but also morally questionable. As long as we believe in singular reasons, we will always be able to trace triumphs or disasters back to individuals and stamp them ‘responsible.’ The idiotic hunt for a scapegoat goes hand in hand with the exercise of power—a game that people have been playing for thousands of years.”
22. Sleeper Effect
- “In our memories, the source of the argument fades faster than the argument. In other words, your brain quickly forgets where the information came from (e.g., from the Department of Propaganda). Meanwhile, the message itself (i.e., war is necessary and noble) fades only slowly or even endures. Therefore, any knowledge that stems from an untrustworthy source gains credibility over time. The discrediting force melts away faster than the message does.”
23. Story Bias
- “Stories are dubious entities. They simplify and distort reality and filter things that don’t fit … From our own life stories to global events, we shape everything into meaningful stories. Doing so distorts reality and affects the quality of our decisions … Whenever you hear a story, ask yourself: Who is the sender, what are his intentions, and what did he hide under the rug? The omitted elements might not be of relevance. But, then again, they might be even more relevant than the elements featured in the story … The real issue with stories: They give us a false sense of understanding, which inevitably leads us to take bigger risks and urges us to take a stroll on thin ice.”
24. Sunk Cost Fallacy
- “The sunk cost fallacy is most dangerous when we have invested a lot of time, money, energy, or love in something. This investment becomes a reason to carry on, even if we are dealing with a lost cause. The more we invest, the greater the sunk costs are, and the greater the urge to continue becomes … This irrational behavior is driven by a need for consistency. After all, consistency signifies credibility. We find contradictions abominable. If we decide to cancel a project halfway through, we create a contradiction: We admit that we once thought differently. Carrying on with a meaningless project delays this painful realization and keeps up appearances.”
25. Survivorship Bias
- “In daily life, because triumph is made more visible than failure, you systematically overestimate your chances of succeeding. As an outsider, you succumb to an illusion, and you mistake how minuscule the probability of success really is … Survivorship bias means this: People systematically overestimate their chances of success. Guard against it by frequently visiting the graves of once-promising projects, investments, and careers. It is a sad walk but one that should clear your mind.”