#books

![[Pasted image 20221121131340.png]]

A recurrent theme of this book is:

>our mind is strongly biased toward causal explanations and does not deal well with "mere statistics." (p.182)

## Chapter 5: Cognitive Ease

>The impression of familiarity is produced by System 1, and System 2 relies on that impression for a true/false judgment.
>
>Anything that makes it easier for the associative machine to run smoothly will also bias beliefs.
>
>...familiarity is not easily distinguished from truth. (p.62)

## Chapter 7: A Machine for Jumping to Conclusions

>Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information. These are the circumstances in which intuitive errors are probable, which may be prevented by a deliberate intervention of System 2. (p.79)

### A Bias to Believe and Confirm

Psychologist Daniel Gilbert proposed that:

>...understanding a statement must begin with an attempt to believe it: you must first know what the idea would mean if it were true. Only then can you decide whether or not to *unbelieve* it. (p.81)

>The operations of associative memory contribute to a general *confirmation bias.*
>[...]
>A deliberate search for confirming evidence, known as *positive test strategy*, is also how System 2 tests a hypothesis. (p.82)

To combat confirmation bias we should always be seeking out [[Thinking Probabilistically|disconfirming evidence]].

### Exaggerated Emotional Coherence (Halo Effect)

>The tendency to like (or dislike) everything about a person - including things you have not observed - is known as the halo effect.
>[...]
>The halo effect is also an example of suppressed ambiguity: like the word *bank*, the adjective *stubborn* is ambiguous and will be interpreted in a way that makes it coherent with the context. (p.82)

*Bank* can mean a financial institution or the bank of a river, just as *stubborn* can be interpreted as a virtue (stubborn persistence) or a vice (stubbornly critical).

>The sequence in which we observe characteristics of a person is often determined by chance. Sequence matters, however, because the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted. (p.84)

*The Wisdom of Crowds*, by James Surowiecki, demonstrates how individuals can do a task very poorly while pools of the same individuals do the same task remarkably well. This works because each individual's errors are independent of everyone else's and, over a large enough sample, average out to zero. The wisdom of the crowd is, however, susceptible to the reflexivity of markets:

>Allowing the observers to influence each other effectively reduces the size of the sample, and with it the precision of the group estimate. (p.84)
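The two halves of this point - independent errors cancel, socially shared errors do not - are easy to check with a quick simulation. A minimal sketch (the true value, error size, and bias structure below are made up for illustration; they come from neither Kahneman nor Surowiecki):

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 100.0     # hypothetical quantity the crowd is estimating
n_observers = 1_000
error_sd = 20.0        # each individual guess is quite noisy

# Independent errors: every individual misses by a lot, but the errors
# cancel, so the crowd average lands very close to the truth.
independent = true_value + rng.normal(0, error_sd, n_observers)

# Socially influenced errors: a shared bias enters every estimate, so
# averaging cannot remove it - the effective sample size shrinks.
shared_bias = rng.normal(0, error_sd)
influenced = true_value + shared_bias + rng.normal(0, error_sd, n_observers)

print("typical individual error :", np.abs(independent - true_value).mean())
print("independent crowd error  :", abs(independent.mean() - true_value))
print("influenced crowd error   :", abs(influenced.mean() - true_value))
```

With independent errors the crowd's error shrinks roughly in proportion to the square root of the number of observers; the shared bias in the second case does not shrink at all no matter how many observers are added, which is the sense in which social influence "effectively reduces the size of the sample."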
### What You See is All There Is (WYSIATI)

>System 1 is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions.
>[...]
>It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern. (p.86-87)

## Chapter 9: Answering An Easier Question

>If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. (p.97)

Kahneman terms this operation ***substitution***, and he goes on to point out that the technical definition of ***heuristic*** is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions.

- [Adam Mastroianni: How to feel bad and be wrong](https://www.experimental-history.com/p/how-to-feel-bad-and-be-wrong)
- [Attribute Substitution](https://en.wikipedia.org/wiki/Attribute_substitution)

>...when called upon to judge probability, people actually judge something else and believe they have judged probability. System 1 often makes this move when faced with difficult target questions, if the answer to a related and easier heuristic question comes readily to mind. (p.98)

Substitution can be a good strategy for solving problems when it is deliberately implemented by System 2 rather than made as a knee-jerk judgment by System 1. Many [[Problem-Solving]] techniques involve solving an easier version or subset of a more difficult problem.

>The automatic processes of the mental shotgun and intensity matching often make available one or more answers to easy questions that could be mapped onto the target question. (p.99)

This process works by System 1 answering an easier, heuristic question instead of the target question and System 2 then endorsing that answer. System 2 retains the option to reject the substituted answer, but "a lazy System 2 often follows the path of least effort and endorses a heuristic answer without much scrutiny of whether it is truly appropriate."

A survey of German students asked the following questions:

1. How happy are you these days?
2. How many dates did you have last month?

The experimenters were interested in the correlation between the two answers but, surprisingly, found none. However, when the questions were asked in the reverse order the correlation increased dramatically.

>The psychology of what happened is precisely analogous to the psychology of the size illusion in figure 9. "Happiness these days" is not a natural or easy assessment. A good answer requires a fair amount of thinking. However, the student who had just been asked about their dating did not need to think hard because they already had in their mind an answer to a related question: how happy they were with their love life. They substituted the question to which they had a ready-made answer for the question they were asked. (p.102)

This gives us opportunities to reframe questions in a way that leads our audience toward the answers we want. If we want a positive response to an abstract question, we can first ask an easier question with a high probability of a favorable answer, and our audience will likely substitute that answer when responding to the second, more difficult question.

### The Affect Heuristic

Paul Slovic has proposed this heuristic, in which people let their likes and dislikes determine their beliefs about the world. If you believe a certain thing is good, you will likely judge its outcomes favorably, and vice versa. This does not mean we are blind to new information or incapable of changing our beliefs, only that we are biased by them. If you want a particular answer to a question, it is better to frame it so that it aligns with the beliefs of your audience. It is easier to get someone to agree to a course of action if you explain the benefits first and the risks later, tying those risks to outcomes they believe are unlikely.
## Chapter 17: Regression to the Mean

The main thrust of this chapter is one of the main themes of the book: "our mind is strongly biased toward causal explanations and does not deal well with 'mere statistics.'" (p.182)

>Regression effects can be found wherever we look, but we do not recognize them for what they are. They hide in plain sight. (p.180)

The "Sports Illustrated *jinx*" is invoked as an example of how we assign a causal narrative to something that is merely a statistical fact of life.

>Causal explanations will be evoked when regression is detected, but they will be wrong because the truth is that regression to the mean has an explanation but does not have a cause. (p.182)

The most thought-provoking part of the chapter centers on the work of Sir Francis Galton, a half cousin of Charles Darwin and renowned polymath, who struggled for years to understand a statistical regularity "as common as the air we breathe."

>It took Francis Galton several years to figure out that correlation and regression are not two concepts - they are different perspectives on the same concept. (p.181)
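The "jinx" falls straight out of a toy model in which performance is stable skill plus season-to-season luck. This is a minimal sketch, not from the book; the equal-variance split between skill and luck is an assumption chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_athletes = 10_000

# Performance = stable skill + luck that is re-drawn every season.
skill = rng.normal(0, 1, n_athletes)
season_1 = skill + rng.normal(0, 1, n_athletes)
season_2 = skill + rng.normal(0, 1, n_athletes)

# Pick the season-1 standouts - the ones who would make the cover.
cover = season_1 >= np.quantile(season_1, 0.99)

# With no jinx at all, they do worse the next season on average, because
# part of their season-1 score was luck that does not repeat.
print("cover athletes, season 1 mean:", season_1[cover].mean())
print("cover athletes, season 2 mean:", season_2[cover].mean())
print("season-to-season correlation :", np.corrcoef(season_1, season_2)[0, 1])
```

The last number is Galton's point: in standard units the best prediction of season 2 from season 1 is just the correlation times the season-1 score, so any correlation below 1 pulls the prediction back toward the mean. Correlation and regression are the same fact viewed from two sides.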
## Chapter 19: The Illusion of Understanding

[[The Black Swan#Chapter 6 The Narrative Fallacy]]

>The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen. (p.199)

>A compelling narrative fosters an illusion of inevitability. (p.200)

![[Keep it Simple#^3e60d5]]

>The ultimate test of an explanation is whether it would have made the event predictable in advance. (p.200)

>You cannot help dealing with the limited information you have as if it were all there is to know. You build the best possible story from the information available to you, and if it is a good story, you believe it. Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance. (p.201)

In [Beware the Metagame](https://amasad.me/meta), Amjad Masad explains how the more abstract a field is, the easier it is to reason about.

>The core of the illusion is that we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do. *Know* is not the only word that fosters this illusion. In common usage, the words *intuition* and *premonition* also are reserved for past thoughts that turned out to be true.
>[...]
>To think clearly about the future, we need to clean up the language that we use in labeling the beliefs we had in the past. (p.202)

**Hindsight Bias**

>Your inability to reconstruct past beliefs will inevitably cause you to underestimate the extent to which you were surprised by past events. (p.202)

>It leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad. (p.203)

How hindsight bias reduces our willingness to take [[risk]]:

>Because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions - and to an extreme reluctance to take risks. (p.204)

**Illusions are comforting**

>These illusions are comforting. They reduce the anxiety we would experience if we allowed ourselves to fully acknowledge the uncertainties of existence. (p.205)

**On being suspicious of highly consistent patterns**

>Knowing the importance of luck, you should be particularly suspicious when highly consistent patterns emerge from the comparisons of successful and less successful firms. *In the presence of randomness, regular patterns can only be mirages.* (p.207)

**Case Studies**

- [[CHG Issue 130 Discerning Expertise]]

## Chapter 22: Expert Intuition: When Can We Trust It?

Naturalistic Decision Making (NDM):

- "They are deeply skeptical about the value of using rigid algorithms to replace human judgment" (p.235)
- Analyzes "how experienced professionals develop intuitive skills" (p.235)
- Malcolm Gladwell's bestselling book *Blink*

>...stock pickers and political scientists who make long-term forecasts operate in a zero-validity environment. Their failures reflect the basic unpredictability of the events that they try to forecast. (p.240)

## Chapter 26: Prospect Theory

Prospect theory is probably what Kahneman and his partner Amos Tversky are most famous for.

**Case Studies**

- [[CHG Issue 165 The Road to Hell is Paved with Good Intentions]]