The concept of a ‘narrative fallacy’ was introduced by Nassim Taleb and describes how flawed stories of the past shape our current views and future expectations. An explanation is considered more appealing if it is concrete, assigns a significant role to talent, intentions or ignorance (rather than luck), and focuses on a few conspicuous events that happened rather than on the numerous events that did not happen.
People are prone to interpret someone’s behavior as a reflection of personality traits and general propensities, which are easy to match to effects. The halo effect contributes to coherence: our judgment of one significant attribute influences how we view all qualities. If you consider a soccer player to be strong and attractive, you are likely to think of him as an excellent player as well. If you find him unattractive, you will probably underrate his soccer skills. The halo effect exaggerates the consistency of judgments: bad people are all bad and nice people do only nice things. Reading ‘Hitler liked cats and toddlers’ causes a shock, because such a bad person having a good side violates our expectations.
When you read a story about the founders of a highly successful company, with almost every choice they made having a good outcome, you get the sense that you understand what made the company succeed. You get the feeling that you learned what it takes to found successful companies. It is, however, very likely that your sense of understanding and learning from the story is mostly illusory. An explanation can be tested by determining whether it would have made the event predictable in advance. The story about the successful company won’t meet that test, because no story can include all the events that would have caused a divergent outcome. Our minds can’t handle events that did not happen. The fact that most significant events involved choices makes you exaggerate the role of skill and underestimate the influence of luck. Although the founders were skilled, luck had a big influence on the great outcome. This demonstrates the power of the WYSIATI rule (‘what you see is all there is’). You treat the limited information you received as if it were all there is to know. You construct the best possible story from the available information and if it’s a nice one, you believe it. The less you know, the easier it is to form a coherent story.
People who say “I knew well before the economic crisis happened that it was inevitable” are wrong: they thought it would happen; they did not ‘know’ it. They only say ‘knew’ afterwards because it did happen.
It is an illusion to believe that we understand the past: we understand it less than we believe we do. The words ‘know’, ‘premonition’ and ‘intuition’ refer to past thoughts that turned out to be true. They should be avoided in order to think clearly about future events.
What are the costs of hindsight?
Our mind is a sense-maker. When something unpredicted happens, you instantly revise your view of the world so the surprise fits in. Learning from surprises seems sensible, but there can be dangerous consequences. Our mind is limited by its flawed ability to reconstruct beliefs that have changed, or past states of knowledge. As soon as you adjust your view of the world, you are unable to recall your past belief. Instead of reconstructing what they used to believe, people retrieve their current belief (substitution), and most people cannot believe they ever held another belief. Not being able to reconstruct former beliefs causes us to underestimate the extent to which we were surprised by past events. This is called the ‘hindsight bias’ or the ‘I-knew-it-all-along’ effect.
In an experiment, participants were asked to assign probabilities to a number of possible outcomes. After the event occurred, they were asked to recall their previous answers. They exaggerated the probabilities they had originally assigned to events that did occur, and recalled events that did not occur as always having been unlikely. Other studies also demonstrate how we tend to revise our past beliefs in light of what actually occurred, which generates a cognitive illusion. Hindsight bias negatively affects the evaluation of decision makers. The quality of a decision should be assessed by whether the process was right, not by whether the outcome was right. Imagine a low-risk surgery going wrong due to an unpredictable accident. People are afterwards likely to believe that it actually was a risky surgery and that the doctor’s decision to order it was wrong. This is an example of the outcome bias, which makes it very hard to evaluate a decision properly.
Hindsight is particularly troubling for people who make decisions for others, like financial advisers, politicians or physicians. When the outcome is bad, clients usually blame them for failing to see it coming, even though the signs only became clear afterwards. Decision makers who fear having their decisions scrutinized in hindsight tend to change their procedures, which leads to bureaucracy and increased social costs. Physicians order more tests, refer more people to specialists and apply treatments that probably won’t work. Hindsight and the outcome bias can also result in rewarding irresponsible decision makers who took a lot of risk but got lucky.
What are the recipes for success?
System 1’s habit of trying to make sense of things makes us view the world as simpler, more coherent, tidier and more predictable than it actually is. The illusion that we understand the past induces the illusion that we are capable of predicting and controlling the future. These illusions make us feel comfortable, as acknowledging the uncertainty of our existence would make us anxious.
Managers and leaders influence the outcomes of their businesses, but the impact of management practices and leadership style on success is often exaggerated in success stories. If you ask business experts what they think about the reputation of a CEO, their knowledge of whether the business is doing well or poorly produces a halo. The CEO of a profitable company will be praised, but if things go south a year later, the same CEO will be reviewed negatively. While both reviews seem correct at the time, it is strange to say contradictory things about the same person (first decisive, then confused). This illustrates the power of the halo effect. It also results in a reversed causal relationship: we tend to believe that the business fails because the leader is confused, but the opposite is true: the leader appears confused because the business is doing poorly.
The combination of the outcome bias and the halo effect explains the popularity of books with titles like ‘How to Build a Successful Business’. The key message of these books is that good management practices will be rewarded with profit. The difference between a successful company and a less successful company, however, is often not great leadership but luck. Even if you are convinced that a leader is extremely competent and visionary, you would not be able to predict the performance of the company. The average gap between the compared successful and less successful companies shrank over time, most likely because the original gap was due to luck (regression to the mean).
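The shrinking gap can be illustrated with a small simulation (a hypothetical model for this summary, not an example from the book): if each company's performance is a fixed skill component plus independent luck in each period, then companies selected for extreme period-1 results will, on average, sit closer to the mean in period 2, purely because their luck does not repeat.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Model assumption: performance = skill + luck, both standard normal.
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]
perf1 = [s + random.gauss(0, 1) for s in skill]  # period 1: skill + luck
perf2 = [s + random.gauss(0, 1) for s in skill]  # period 2: skill + fresh luck

# Rank companies by period-1 performance; compare top vs bottom decile.
order = sorted(range(N), key=lambda i: perf1[i], reverse=True)
top, bottom = order[: N // 10], order[-(N // 10):]

gap1 = (sum(perf1[i] for i in top) - sum(perf1[i] for i in bottom)) / len(top)
gap2 = (sum(perf2[i] for i in top) - sum(perf2[i] for i in bottom)) / len(top)

print(f"gap in period 1: {gap1:.2f}")
print(f"gap in period 2: {gap2:.2f}")  # smaller: the luck component regressed
```

The skill gap persists into period 2, so the gap does not vanish entirely, but the luck that inflated it in period 1 averages out: selecting on an extreme outcome guarantees some regression toward the mean even when nobody's ability changed.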
Summary per chapter of the 1st edition of Thinking, Fast and Slow by Kahneman
- What is the book about?
- Part 1: How do fast thinking and slow thinking work? Chapters 1-9
- Part 2: How do heuristics and biases work? Chapters 10-18
- Part 3: In what ways can you get overconfident? Chapters 19-24
- Part 4: How do you make choices and decisions? Chapters 25-34
- Part 5: What is the effect of fast and slow thinking on your experiences, choices and well-being? Chapters 35-38