# The Scout Mindset

![rw-book-cover](https://readwise-assets.s3.amazonaws.com/media/uploaded_book_covers/profile_3190/e246e673-07dd-45f0-a7bb-4956d1ee9ecd.png)

## Metadata

- Author: [[Julia Galef]]
- Full Title: The Scout Mindset
- Category: #books
- Notes: [[The Scout Mindset - Notes]]

## Highlights

### Introduction

- …__[[scout mindset]]__: the motivation to see things as they are, not as you wish they were
    - Tags: [[definition]]
- Knowing that you should test your assumptions doesn't automatically improve your judgment, any more than knowing you should exercise automatically improves your health. Being able to rattle off a list of biases and fallacies doesn't help you unless you're willing to acknowledge those biases and fallacies in your own thinking. The biggest lesson I learned is something that's since been corroborated by researchers, as we'll see in this book: __our judgment isn't limited by knowledge nearly as much as it's limited by attitude__.
- My approach has three prongs:
    1. Realize that truth isn't in conflict with your other goals
    2. Learn tools that make it easier to see clearly
    3. Appreciate the emotional rewards of scout mindset

#### 1. Realize That Truth Isn't in Conflict with Your Other Goals

#### 2. Learn Tools That Make It Easier to See Clearly

#### 3. Appreciate the Emotional Rewards of Scout Mindset

- ...it's empowering to be able to resist the temptation to self-deceive, and to know that you can face reality even when it's unpleasant.

### Part I: The Case for Scout Mindset

#### Chapter 1: Two Types of Thinking

- …an aspect of human psychology called __directionally [[motivated reasoning]]__--or, more often, just __[[motivated reasoning]]__--in which our unconscious motives affect the conclusions we draw. The best description of [[motivated reasoning]] I've ever seen comes from Tom Gilovich. When we want something to be true, he said, we ask ourselves, "Can I believe this?," searching for an excuse to accept it. When we don't want something to be true, we instead ask ourselves, "Must I believe this?," searching for an excuse to reject it....It's all around you under different names--denial, wishful thinking, confirmation bias, rationalization, tribalism, self-justification, overconfidence, delusion.
    - Tags: [[definition]]
- The tricky thing about [[motivated reasoning]] is that even though it's easy to spot in other people, it doesn't feel like [[motivated reasoning]] from the inside.
- We talk about our beliefs as if they're military positions, or even fortresses, built to resist attack. Beliefs can be *deep-rooted*, *well-grounded*, *built on fact*, and *backed up* by arguments. They *rest on solid foundations*. We might hold a *firm* conviction or a *strong* opinion, be *secure* in our beliefs or have *unshakeable* faith in something. Arguments are either forms of attack or forms of defense. If we're not careful, someone might *poke holes* in our logic or *shoot down* our ideas. We might encounter a *knock-down* argument against something we believe. Our positions might get *challenged*, *destroyed*, *undermined*, or *weakened*. So we look for evidence to *support*, *bolster*, or *buttress* our position. Over time, our views become *reinforced*, *fortified*, and *cemented*. And we become *entrenched* in our beliefs, like soldiers holed up in a trench, safe from the enemy's volleys. And if we do change our minds? That's surrender. If a fact is *inescapable*, we might *admit*, *grant*, or *allow* it, as if we're letting it inside our walls.
If we realize our position is *indefensible*, we might *abandon* it, *give it up*, or *concede* a point, as if we're ceding ground in a battle. Throughout the next few chapters, we'll learn more about [[motivated reasoning]], or as I call it, __[[soldier mindset]]__....
    - Tags: [[definition]]
- …__[[accuracy motivated reasoning]]__. In contrast to directionally [[motivated reasoning]], which evaluates ideas through the lenses of "Can I believe it?" and "Must I believe it?," [[accuracy motivated reasoning]] evaluates ideas through the lens of "Is it true?"
    - Tags: [[definition]]
- If directionally [[motivated reasoning]] is like being a soldier fighting off threatening evidence, [[accuracy motivated reasoning]] is like being a scout forming a map of the strategic landscape.
- Of course, all maps are imperfect simplifications of reality, as a scout well knows. Striving for an accurate map means being aware of the limits of your understanding, keeping track of the regions of your map that are especially sketchy or possibly wrong. And it means always being open to changing your mind in response to new information. In [[scout mindset]], there's no such thing as a "threat" to your beliefs. If you find out you were wrong about something, great--you've improved your map, and that can only help you.
- "What are the most likely ways this could fail?" allows you to strengthen your plan against those possibilities in advance.
    - Tags: [[coaching]]
- Being the kind of person who welcomes the truth, even if it's painful, is what makes other people willing to be honest with you.
- We're all a mixture of scout and soldier. But some people, in some contexts, are better scouts than most.

#### Chapter 2: What the Soldier Is Protecting

- …when you advocate changing something, you should make sure you understand why it is the way it is in the first place. This rule is known as __[[Chesterton's fence]]__....
    - Tags: [[definition]]
- "What function does [[motivated reasoning]] serve?" I've broken it down into six overlapping categories: comfort, self-esteem, morale, persuasion, image, and belonging.

##### Comfort: Avoiding Unpleasant Emotions

##### Self-Esteem: Feeling Good About Ourselves

- …we often use [[soldier mindset]] to protect our egos by finding flattering narratives for unflattering facts.
- Over time, our beliefs about the world adjust to accommodate our track record.
- Your self-image shapes even your most fundamental beliefs about how the world works. Poorer people are more likely to believe that luck plays a big role in life, while wealthier people tend to credit hard work and talent alone.
- Psychologists make a distinction between __[[self-enhancement]]__, which means boosting your ego with positive beliefs, and __[[self-protection]]__, which means avoiding blows to your ego.
    - Tags: [[definition]]
- …Natalie Wynn calls it "__[[masochistic epistemology]]__"--whatever hurts is true.
    - Tags: [[definition]]

##### Morale: Motivating Ourselves to Do Hard Things

- We need morale to make tough decisions and act on them with conviction. That's why decision-makers often avoid considering alternative plans or downsides to their current plan.
- Comfort, self-esteem, and morale are *emotional* benefits, meaning that the ultimate target of our deception is ourselves. The next three benefits of [[soldier mindset]] are a little different. Persuasion, image, and belonging are *social* benefits--in these cases the target of our deception is other people, by way of ourselves.
##### Persuasion: Convincing Ourselves So We Can Convince Others

##### Image: Choosing Beliefs That Make Us Look Good

- Psychologists call it __impression management__, and evolutionary psychologists call it __signaling__: When considering a claim, we implicitly ask ourselves, "What kind of person would believe a claim like this, and is that how I want other people to see me?"
    - Tags: [[definition]]

##### Belonging: Fitting in to Your Social Groups

- To be clear, deferring to a consensus isn't inherently a sign of [[soldier mindset]]….Deferring to the consensus is often a wise heuristic, since you can't investigate everything for yourself, and other people know things you don't. What makes it [[motivated reasoning]] is when you wouldn't even want to find out if the consensus was wrong.
- And in some groups, fitting in comes with restrictions on what you're allowed to want or to believe about yourself. It's been called __[[tall poppy syndrome]]__: anyone who seems like they're trying to be a "tall poppy," showing too much self-regard or ambition, is cut down to size.
    - Tags: [[definition]]
- We use [[motivated reasoning]] not because we don't know any better, but because we're trying to protect things that are vitally important to us….

#### Chapter 3: Why Truth Is More Valuable Than We Realize

- This is one of the paradoxes of being human: that our beliefs serve such different purposes all at once. Invariably, we end up making trade-offs. (Page 29)
- The hypothesis that the human mind evolved the ability to make these trade-offs well is called the __[[rational irrationality hypothesis]]__, coined by economist Bryan Caplan. If the name sounds like a paradox, that's because it's using two different senses of the word *rational*: epistemic rationality means holding beliefs that are well justified, while instrumental rationality means acting effectively to achieve your goals. Being rationally irrational, therefore, would mean that we're good at unconsciously choosing *just enough* epistemic irrationality to achieve our social and emotional goals, without impairing our judgment too much. (Page 31)
    - Tags: [[definition]]
- No, we're far from rationally irrational. There are several major biases in our decision-making, several ways in which we systematically misjudge the costs and benefits of truth. (Page 32)
- The source of this self-sabotage is __[[present bias]]__, a feature of our intuitive decision-making in which we care too much about short-term consequences and too little about long-term consequences. (Page 32)
    - Tags: [[definition]]
- It's widely known that [[present bias]] shapes our choices about how to act. What's much less appreciated is that it also shapes our choices about how to think….we reap the rewards of thinking in [[soldier mindset]] right away, while the costs don't come due until later. (Page 33)
- …the benefit of an act of [[scout mindset]] isn't just about making your map of reality a bit more accurate. The benefit is in the habits and skills you're reinforcing….Every time you say, “Oh, that's a good point, I hadn't thought of that," it gets a little bit easier for you to acknowledge good points in general. Every time you opt to check a fact before citing it, you become a little bit more likely to remember to check your facts in general. Every time you're willing to say, “I was wrong," it gets a little bit easier to be wrong in general. (Page 34)
    - Note: See also _[[Atomic Habits]]_ book
- Just like the lies we tell others, the lies we tell ourselves have ripple effects.
(Page 35)
- Perhaps in many instances the harm is negligible. But the fact that the harm is delayed and unpredictable should ring an alarm bell. This is exactly the kind of cost we tend to neglect when we're intuitively weighing costs and benefits. (Page 36)
- Social costs like looking weird or making a fool out of ourselves feel a lot more significant than they actually are. In reality, other people aren't thinking about you nearly as much as you intuitively think they are, and their opinions of you don't have nearly as much impact on your life as it feels like they do. As a result, we end up making tragic trade-offs, sacrificing a lot of potential happiness to avoid relatively small social costs. (Page 36)
- We’re overly tempted by immediate payoffs, even when they come at a steep cost later on. We underestimate the cumulative harm of false beliefs, and the cumulative benefit of practicing scout habits. We overestimate how much other people judge us, and how much impact their judgments have on our lives. As a result of all these tendencies, we end up being far too willing to sacrifice our ability to see clearly in exchange for short-term emotional and social rewards. That doesn't mean [[scout mindset]] is always the better choice—but it does mean we have a bias in favor of the soldier, *even when the scout is a better choice*. (Page 38)
    - Note: Chapter summary
- This abundance of opportunity makes [[scout mindset]] far more useful than it would have been for our ancestors. After all, what's the point of admitting your problems exist if you can't fix them?… So if our instincts undervalue truth, that's not surprising—our instincts evolved in a different world, one better suited to the soldier. (Page 40)

### Part II: Developing Self-Awareness

#### Chapter 4: Signs of a Scout

- A key factor preventing us from being in [[scout mindset]] more frequently is our conviction that we're already in it. (Page 43)

##### Feeling Objective Doesn’t Make You a Scout

- In fact, viewing yourself as rational can backfire. The more objective you think you are, the more you trust your own intuitions and opinions as accurate representations of reality, and the less inclined you are to question them. (Page 44)

##### Being Smart and Knowledgeable Doesn’t Make You a Scout

- A high IQ and an advanced degree might give you an advantage in ideologically neutral domains like solving math problems or figuring out where to invest your money. But they won't protect you from bias on ideologically charged questions. (Page 48)

##### 1. Do you tell other people when you realize they were right?

- Technically, [[scout mindset]] only requires you to be able to acknowledge to yourself that you were wrong, not to other people. Still, a willingness to say "I was wrong" to someone else is a strong sign of a person who prizes the truth over their own ego. __Can you think of cases in which you've done the same?__ (Page 51)
    - Tags: [[reflection]]

##### 2. How do you react to personal criticism?

- Are there examples of criticism you've acted upon? Have you rewarded a critic (for example, by promoting him)? Do you go out of your way to make it easier for other people to criticize you? (Page 52)
    - Tags: [[reflection]]

##### 3. Do you ever prove yourself wrong?

- Can you think of any examples in which you voluntarily proved yourself wrong? (Page 54)
    - Tags: [[reflection]]

##### 4. Do you take precautions to avoid fooling yourself?

- Do you try to avoid biasing the information you get?
…do you describe disagreement without revealing which side you were on, so as to avoid influencing your friend's answer? When you launch a new project at work, do you decide ahead of time what will count as a success and what will count as a failure…? (Page 56)
    - Tags: [[reflection]]

##### 5. Do you have any good critics?

- Can you name people who are critical of your beliefs, profession, or life choices who you consider thoughtful, even if you believe they're wrong? Or can you at least name reasons why someone might disagree with you that you would consider reasonable (even if you don't happen to know of specific people who hold those views)? (Page 57)
    - Tags: [[reflection]]
- But the biggest sign of [[scout mindset]] may be this: __Can you point to occasions in which you were in [[soldier mindset]]?__ (Page 57)
    - Tags: [[reflection]]

#### Chapter 5: Noticing Bias

- One of the essential tools in a magician's tool kit is a form of manipulation called __[[forcing]]__. (Page 59)
    - Tags: [[definition]]
- …here's an important tip to keep in mind when doing a thought experiment: Try to *actually imagine* the counterfactual scenario….Thought experiments only work if you actually do them. So don’t simply formulate a verbal question for yourself. Conjure up the counterfactual world, place yourself in it, and observe your reaction. (Page 62)
- __[[double standard test]]__: “Am I judging other people's behavior by a standard I wouldn't apply to myself?" The [[double standard test]] can be applied to groups as well as individuals. (Page 63)
    - Tags: [[definition]]
- __[[outsider test]]__: Imagine someone else stepped into your shoes—what do you expect they would do in your situation? (Page 65)
- __[[conformity test]]__: Imagine this person told me that they no longer held this view. Would I still hold it? Would I feel comfortable defending it to them? (Page 67)
    - Tags: [[definition]]
- __[[selective skeptic test]]__: Imagine this evidence supported the other side. How credible would you find it then? (Page 68)
    - Tags: [[definition]]
- __[[status quo bias test]]__: Imagine your current situation was no longer the status quo. Would you then actively choose it? (Page 70)
    - Tags: [[definition]]
    - Note: flip the circumstances
- What thought experiments do is simply reveal that your reasoning changes as your motivations change. (Page 71)

#### Chapter 6: How Sure Are You?

- To be fair, the certainty we express is partly just for simplicity's sake. Conversation would be unwieldy if we had to stop and assign a probability to every statement we made. But even when someone does prompt us to stop and reflect on our level of confidence, we often claim to be completely certain. (Page 74)
- …a large portion of overconfidence stems from a desire to feel certain. (Page 75)
- A scout treats their degree of certainty as a prediction of their likelihood of being right. (Page 75)
- What you're implicitly aiming for when you tag your beliefs with various confidence levels is __[[perfect calibration]]__. That means your "50% sure" claims are in fact correct 50 percent of the time, your "60% sure" claims are correct 60 percent of the time, your "70% sure" claims are correct 70 percent of the time, and so on. (Page 76)
    - Tags: [[definition]]
- Remember, the goal isn't to know as much as possible. It's to *know how much you know*. (Page 78)
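
To make [[perfect calibration]] concrete, you can score a log of your own predictions: group claims by stated confidence and compare each group's stated level with its actual hit rate. A minimal sketch in Python; the log format and the sample numbers are hypothetical illustrations, not from the book.

```python
from collections import defaultdict

def calibration_report(predictions):
    """Group (confidence, was_correct) pairs by stated confidence
    and compare each stated level to the actual hit rate."""
    buckets = defaultdict(list)
    for confidence, was_correct in predictions:
        buckets[confidence].append(was_correct)
    for confidence in sorted(buckets):
        outcomes = buckets[confidence]
        hit_rate = sum(outcomes) / len(outcomes)
        print(f"said {confidence:.0%} sure -> right {hit_rate:.0%} "
              f"of the time ({len(outcomes)} claims)")

# Hypothetical log: a perfectly calibrated forecaster's "70% sure"
# claims would come out true about 70 percent of the time.
log = [(0.5, True), (0.5, False), (0.7, True), (0.7, True),
       (0.7, False), (0.9, True), (0.9, True), (0.9, True)]
calibration_report(log)
```

Perfect calibration would mean the two percentages match at every level; in practice, the direction and size of the gaps tell you whether you lean over- or underconfident.
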
- Happily, calibration is a skill with a quick learning curve. A couple of hours of practice is all it takes for most people to become very well calibrated—at least within a single domain, like trivia questions. (Your calibration skill in one domain will carry over partially, but not completely, to other domains.) (Page 82)
- The press secretary makes *claims*; the board makes *bets*. …A __[[bet]]__ is any decision in which you stand to gain or lose something of value, based on the outcome. That could include money, health, time—or reputation….So when you're thinking about how sure you are, your answer will be more honest if you switch from thinking in terms of "What can I get away with claiming to myself?" to __"How would I [[bet]], if there was something at stake?"__ (Page 83)
    - Tags: [[definition]] [[reflection]]
- A tip when you're imagining betting on your beliefs: You may need to get more concrete about what you believe by coming up with a hypothetical test that could be performed to prove you right or wrong. For example, if you believe "Our computer servers are highly secure," a hypothetical test might be something like this: Suppose you were to hire a hacker to try to break in to your systems. If they succeed, you lose one month's salary. How confident do you feel that you would win that [[bet]]? (Page 84)
    - Tags: [[security 1]]
- __[[equivalent bet test]]__ (Page 85)
    - Tags: [[definition]]
    - Note: You compare against different odds and see which one you would rather [[bet]] on, to help you hone in on the right odds.
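
One way to picture the note above: the [[equivalent bet test]] works like a binary search over odds. At each step you ask whether you'd rather bet on your belief or on a lottery that pays out with known probability p; the point where you become indifferent approximates your real confidence. A toy sketch of that logic, where the hypothetical `prefers_belief_bet` function stands in for the gut check you'd actually perform.

```python
def equivalent_bet(prefers_belief_bet, steps=10):
    """Binary-search for the odds at which a known-probability
    lottery feels exactly as attractive as betting on your belief."""
    low, high = 0.0, 1.0
    for _ in range(steps):
        p = (low + high) / 2
        if prefers_belief_bet(p):
            low = p   # your belief beats a p-lottery: confidence > p
        else:
            high = p  # the p-lottery feels safer: confidence < p
    return (low + high) / 2

# Hypothetical gut check: the belief bet feels better only while
# the lottery's payout probability is below 80 percent.
print(equivalent_bet(lambda p: p < 0.80))  # converges to ~0.80
```
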
- There's a core skill in this chapter, too: being able to tell the difference between the feeling of *making a claim* and the feeling of *actually trying to guess what's true*. Making a claim feels like your press secretary is speaking. It feels pat: neat and tidy. Sometimes hurried, as if you're trying to put something past yourself. The mental motion is declaring, proclaiming, insisting, or perhaps scoffing. Trying to guess what’s true is like being the board of directors, deciding how to [[bet]]. There's at least a second or two when you don't know what answer you're going to end up giving. It's like you're squinting at the evidence, trying to summarize what you see. The mental motions involved are estimating, predicting, weighing, and deliberating. Quantifying (Page 87)
    - Note: Chapter summary

### Part III: Thriving Without Illusions

#### Chapter 7: Coping with Reality

- Scouts aren't invulnerable to fear, anxiety, insecurity, despair, or any of the other emotions that give rise to [[motivated reasoning]], and they rely on coping strategies just like anyone else. They just take more care to select coping strategies that don't mess with the accuracy of their judgment. (Page 95)

##### Make a plan

- It's striking how much the urge to conclude "That's not true" diminishes once you feel like you have a concrete plan for what you would do if the thing *were* true. (Page 97)

##### Notice silver linings

- A silver lining to any mistake is the lesson you're going to extract from the experience, which you can use to help save you from similar mistakes in the future. Remember, the goal isn't to convince yourself that your misfortune is actually a good thing, (Page 98)

##### Focus on a different goal

- Rather than priding himself on being a great programmer, he decided to start priding himself on being an *astute judge of programming talent*. That was a satisfying enough substitute for the original goal, and actually helpful for hiring instead of counterproductive. (Page 99)

##### Things could be worse

- One friend of mine copes with painful criticism by conjuring up a feeling of gratitude toward his critic….I cope by focusing on how much better I’m going to be in the future if I can get myself to think honestly about the criticism. (Page 103)

#### Chapter 8: Motivation Without Self-Deception

- Of course, any given individual may have a better or worse chance of success than the overall odds suggest, depending on how talented, hardworking, charismatic, or well connected they are. But the overall odds are an important baseline to be aware of; the longer the odds, the better and luckier you'll have to be to beat them. (Page 107)
- This is the biggest problem with the self-belief approach to motivation. Because you're not supposed to think realistically about risk, it becomes impossible to ask yourself questions like, "Is this goal desirable enough to be worth the risk?" and "Are there any other goals that would be similarly desirable but require less risk?" It implicitly assumes that you don't need to make any decisions; that you've already found the one right path, and there are no other options out there worth weighing. (Page 108)
- The reality is that there's no clear divide between the "decision-making" and "execution" stages of pursuing a goal. Over time, your situation will change, or you'll learn new information, and you'll need to revise your estimate of the odds. (Page 110)
- …scouts aren't motivated by the thought, "This is going to succeed." They're motivated by the thought, "This is a [[bet]] worth taking." (Page 112)
- Another way to think about whether a [[bet]] is positive expected value is to imagine taking it many times. Would the value of the expected successes outweigh the value of the expected failures?…In reality, you almost never get to repeat the exact same [[bet]] many times. But you'll have the opportunity to make many different bets over the course of your life. (Page 115)
- But as long as you continue making positive expected value bets, that variance will mostly wash out in the long run. Building that variance into your expectations has the nice side effect of giving you equanimity. Instead of being elated when your bets pay off, and crushed when they don't, your emotions will be tied to the trend line underneath the variance. (Page 117)
- "You want to get into a mental state where if the bad outcome comes to pass, you will only nod your head and say 'I knew this card was in the deck, and I knew the odds, and I would make the same bets again, given the same opportunities.'" (Page 119)
    - Note: Source: Nate Soares
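
The "imagine taking it many times" framing above is expected value plus the long run. A small sketch with made-up numbers (the 30 percent chance and the payoff figures are hypothetical): compute the probability-weighted value of a single bet, then simulate many such bets to watch the variance wash out around the trend line.

```python
import random

def expected_value(p_success, gain, loss):
    """Probability-weighted value of a single bet."""
    return p_success * gain - (1 - p_success) * loss

# Hypothetical bet: 30% chance of winning 300, 70% chance of losing 100.
p, gain, loss = 0.30, 300, 100
print(expected_value(p, gain, loss))  # 0.3*300 - 0.7*100 = +20 per bet

# Repeat many such bets: individual outcomes swing wildly, but the
# average converges toward the +20 trend line underneath the variance.
outcomes = [gain if random.random() < p else -loss for _ in range(100_000)]
print(sum(outcomes) / len(outcomes))  # close to +20
```
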
#### Chapter 9: Influence Without Overconfidence

- Confidence is one of those words that we use to mean different things without even realizing it. One is __[[epistemic confidence]]__, or certainty—how sure you are about what's true.…If you say, "I'm 99 percent sure he is lying," or "I guarantee this will work," or "There's no way the Republicans can win," you're displaying a lot of [[epistemic confidence]]. Separately, there's __[[social confidence]]__, or self-assurance: Are you at ease in social situations? Do you act like you deserve to be there, like you're secure in yourself and your role in the group? Do you speak as if you're worth listening to? (Page 122)
    - Tags: [[definition]]
- …when it comes to the impression you make on other people, being self-assured is more important than expressing certainty… (Page 125)
- People sometimes bemoan the fact that "superficial" things like posture and voice make such a difference in how we judge each other. But on the bright side, that means that projecting competence doesn't require self-deception. You can boost your [[social confidence]] through practice—speaking up in groups, hiring a speech coach, dressing better, improving your posture—all without compromising your ability to see things clearly. (Page 126)
- When people claim that "admitting uncertainty" makes you look bad, they're invariably conflating these two very different kinds of uncertainty: uncertainty "in you," caused by your own ignorance or lack of experience, and uncertainty "in the world," caused by the fact that reality is messy and unpredictable. The former is often taken as a bad sign about someone's expertise, and justifiably so. But the latter is not—especially if you follow three rules for communicating uncertainty: (Page 128)

##### 1. Show that uncertainty is justified

- Sometimes your audience won't be aware of how much uncertainty exists "in the world" on the topic you're speaking about, and they'll expect you to give answers with more certainty than is actually possible. That's okay; you just need to set their expectations. (Page 129)
- In fact, if you show that certainty is unrealistic, you can be more persuasive than someone who states everything with 100 percent certainty. (Page 129)

##### 2. Give informed estimates

- Give informed estimates and explain where they came from.…Even if reality is messy and it's impossible to know the right answer with confidence, you can at least be confident in your analysis. (Page 130)

##### 3. Have a plan

- One reason people don't like hearing uncertain answers is that it leaves them at a loss for how to act. You can reassure them by following up your uncertainty with a plan or recommendation….having a plan might involve designing a test to pin down some crucial factor with more precision, or proposing a multi-phase plan to allow for occasional reevaluation. (Page 131)
- You can set ambitious goals. You can paint a vivid picture of the world you want to create. You can speak from the heart about why you personally care about this issue. (Page 132)
- First, you don't need to hold your opinions with 100 percent certainty in order to seem confident and competent. People simply aren't paying that much [[attention]] to how much [[epistemic confidence]] you express. They're paying [[attention]] to how you act, to your body language, tone, and other aspects of your [[social confidence]], all of which are things you can cultivate without sacrificing your calibration. Second, expressing uncertainty isn't necessarily a bad thing. It depends on whether the uncertainty is "in you" or "in the world." If you can demonstrate a strong command of the topic and speak with ease about your analysis and your plan, you'll seem like more of an expert, not less. Third, you can be inspiring without overpromising. You can paint a picture of the world you're trying to create, or why your mission is important, or how your product has helped people, without claiming you're guaranteed to succeed. There are lots of ways to get people excited that don't require you to lie to others or to yourself.
(Page 133)
    - Note: Chapter summary
- There are lots of ways to change the game board you're playing on so that you end up with better choices, instead of simply resigning yourself to picking the least-bad choice currently in front of you. (Page 134)

### Part IV: Changing Your Mind

#### Chapter 10: How to Be Wrong

- What made the superforecasters so great at being right was that they were great at being wrong….The superforecasters changed their minds all the time. Not dramatic, 180-degree reversals every day, but subtle revisions as they learned new information. (Page 138)
- If instead you see the world in shades of gray, and you think of “changing your mind" as an incremental shift, then the experience of encountering evidence against one of your beliefs is very different….each adjustment is comparatively low stakes. (Page 140)
- …they would go back and reevaluate their process, asking, "What does this teach me about how to make better forecasts?" …This is another reason superforecasters are much happier to think about what they got wrong—they know that analyzing their errors is an opportunity to hone their technique. (Page 142)
- …one of the biggest benefits of noticing your errors: the opportunity to improve your judgment in general. (Page 143)
- They're __[[domain-general]]__, meaning that they apply to a wide variety of different domains, as opposed to __[[domain-specific]]__ lessons that apply only to a single domain… (Page 143)
    - Tags: [[definition]]
- If it seems like someone is saying something dumb, I might be misunderstanding them. (Page 144)
    - Tags: [[reflection]]
- You might think these principles sound obvious and that you know them already. But “knowing" a principle, in the sense that you read it and say, "Yes, I know that," is different from having internalized it in a way that actually changes how you think….But such knowledge doesn't really become part of you until you’ve derived it for yourself by going through the experience of realizing you were wrong, asking yourself why, and seeing the effect of the bias at work. (Page 144)
- …we’ve explored two ways in which scouts think about error differently from most people. First, they revise their opinions incrementally over time, which makes it easier to be open to evidence against their beliefs. Second, they view errors as opportunities to hone their skill at getting things right, which makes the experience of realizing "I was wrong" feel valuable, rather than just painful. (Page 145)
    - Note: Chapter summary so far
- You've learned new information and come to a new conclusion, but that doesn't mean you were wrong to believe differently in the past. The only reason to be contrite is if you were negligent in some way. (Page 146)
- But most of the time, *being* wrong doesn't mean you *did* something wrong. It's not something you need to apologize for, and the appropriate attitude to have about it is neither defensive nor humbly self-flagellating, but matter-of-fact. (Page 146)
- Instead of "admitting a mistake," scouts will sometimes talk about “updating." That's a reference to __Bayesian updating__, a technical term from probability theory for the correct way to revise a probability after learning new information. (Page 146)
    - Tags: [[definition]]
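
For reference, Bayes' rule is the formula behind "updating": P(H|E) = P(E|H)·P(H) / [P(E|H)·P(H) + P(E|¬H)·(1−P(H))]. A minimal sketch with hypothetical numbers, showing why an update is usually a revision rather than a reversal.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Revise P(hypothesis) after seeing evidence E:
    P(H|E) = P(E|H)*P(H) / [P(E|H)*P(H) + P(E|~H)*(1-P(H))]"""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Hypothetical update: you start out 70% sure, then see evidence
# that's three times likelier if you're wrong (0.6) than if you're
# right (0.2). The belief drops to ~44%: an update, not a collapse.
print(bayes_update(0.70, 0.2, 0.6))  # 0.4375
```
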
- Knowing that you're fallible doesn't magically prevent you from being wrong. But it does allow you to set expectations early and often, which can make it easier to accept when you *are* wrong. (Page 149)
- Discovering you were wrong is an update, not a failure, and your worldview is a living document meant to be revised. (Page 149)

#### Chapter 11: Lean In to Confusion

- This chapter is about how to resist the urge to dismiss details that don't fit your theories, and instead, allow yourself to be confused and intrigued by them, to see them as puzzles to be solved… (Page 152)
- If you want to become better at predicting people's behavior, then shrugging off the times when they violate your expectations is exactly the wrong response. (Page 156)
- The instinct to judge other people's behavior as stupid, irrational, or crazy is very common, and it's also a sign that there's something you're missing....When their behavior confuses you, lean in to that confusion. Treat it as a clue. (Page 157)
- I realized that I had underestimated how wildly different people’s internal experiences of social situations can be. That's changed the way I react in general when someone's behavior strikes me as rude, inconsiderate, or unreasonable. Whereas previously my train of thought would have stopped there, with my feeling irritated at them, I'm now more open to the possibility that we're simply perceiving the social situation differently, and __I get [[curious]] about how__. (Page 159)
- All too often, we assume the only two possibilities are "I'm right" or "The other guy is right"—and since the latter seems absurd, we default to the former. But in many cases, there's an unknown unknown, a hidden "option C," that enriches our picture of the world in a way we wouldn't have been able to anticipate. (Page 161)
- …more often than not, it's the accumulation of many puzzling observations over time that changes your mind—a paradigm shift. (Page 161)
- The rule for paradigm shifts in life is the same as it is in science. Acknowledge anomalies, even if you don't yet know how to explain them, and even if the old paradigm still seems correct overall. Maybe they'll add up to nothing in particular. Maybe they just mean that reality is messy. But maybe they're laying the groundwork for a big change of view. (Page 162)
- In his book *Sources of Power*, decision researcher Gary Klein cites this as one of the top three causes of bad decisions. He calls it a "__[[de minimus error]]__," an attempt to minimize the inconsistency between observations and theory. (Page 165)
    - Tags: [[definition]]
- If the decision-maker had been able to step back and see all the anomalies at once, it would have been clear to them that their paradigm was wrong. But because they were explaining away one single anomaly at a time, their confusion never got the chance to build up sufficiently. That doesn't mean you should go to the other extreme and abandon a paradigm as soon as you notice the slightest bit of conflicting evidence. What the best decision-makers do is look for ways to make sense of conflicting evidence under their existing theory but simultaneously take a mental note: *This evidence stretches my theory by a little (or a lot)*. If your theory gets stretched too many times, then you admit to yourself that you're no longer sure what's happening, and you consider alternate explanations. (Page 165)
- It's a tricky skill. It forces you to act without clarity, to operate under one paradigm while being aware of its flaws and inconsistencies, knowing that it might be wrong and that you might end up abandoning it.
You have to resist the temptation to resolve inconsistency prematurely by forcing all of your observations into one paradigm, and instead be willing to remain confused—for days, weeks, or even years. (Page 166)
- Leaning in to confusion is about inverting the way you're used to seeing the world. Instead of dismissing observations that contradict your theories, get [[curious]] about them. Instead of writing people off as irrational when they don't behave the way you think they should, ask yourself why their behavior might be rational. Instead of trying to fit confusing observations into your preexisting theories, treat them as clues to a new theory. (Page 167)

#### Chapter 12: Escape Your Echo Chamber

- First of all, what kind of person is most likely to initiate a disagreement? A disagreeable person.… To give yourself the best chance of learning from disagreement, you should be listening to people who make it *easier* to be open to their arguments, not harder. People you like or respect, even if you don't agree with them. People with whom you have some common ground—intellectual premises, or a core value that you share—even though you disagree with them on other issues. People whom you consider reasonable, who acknowledge nuance and areas of uncertainty, and who argue in good faith. (Page 171)

##### Listen to People You Find Reasonable

##### Listen to People You Share Intellectual Common Ground With

- Catastrophic climate change is a __[[nondiversifiable risk]]__, Litterman said. That means there's nothing you can invest in that could hedge against the possibility of it happening. (Page 174)
    - Tags: [[definition]]

##### Listen to People Who Share Your Goals

- Dissent isn't all that useful from people you don't respect or from people who don't even share enough common ground with you to agree that you're supposed to be on the same team. (Page 177)
- We need to lower our expectations, by a lot. Even under ideal conditions in which everyone is well-informed, reasonable, and making a good-faith effort to explain their views and understand the other side, learning from disagreements is still hard (and conditions are almost never ideal). Here are three reasons why: (Page 178)

##### 1. We misunderstand each other's views

- Even correct ideas often sound wrong when you first hear them. The thirty-second version of an explanation is inevitably simplistic, leaving out important clarifications and nuance. There's background context you're missing, words being used in different ways than you're used to, and more. (Page 179)

##### 2. Bad arguments inoculate us against good arguments

- When we do encounter a good argument that's new to us, we often mistake it for a bad argument we're already familiar with. (Page 179)

##### 3. Our beliefs are interdependent—changing one requires changing others

- …our beliefs are all interconnected, like a web….For Kevin to significantly update the belief…he'll have to also update a few of his associated beliefs…. (Page 180)

### Part V: Rethinking Identity

#### Chapter 13: How Beliefs Become Identities

- Even just discovering that someone disagrees with you about an identity-laden belief is like finding out they're on a rival team: "Oh, so you're one of *them*." (Page 187)
- Agreeing with a belief isn't the same thing as identifying with it. (Page 187)
- The science on identity is still evolving, but I've observed two things that turn a belief into an identity: Feeling embattled, and feeling proud.
(Page 187)

##### Feeling Embattled

- Being mocked, persecuted, or otherwise stigmatized for our beliefs makes us want to stand up for them all the more, and gives us a sense of solidarity with the other people standing with us. (Page 188)
- It might seem like every issue must have a dominant majority and an embattled minority. But both sides of an issue can genuinely view their side as the embattled one. (Page 188)

##### Feeling Proud

- Beliefs also become part of your identity when they come to represent some virtue you take pride in. (Page 189)
- Feeling proud and feeling embattled often feed into each other. (Page 190)
- Sometimes it's obvious when a belief has become an identity….But for every obvious case like that one, there are many subtler ones, beliefs that may not come with a label or official membership in a group, but that we nevertheless take personally. To notice them, be on the lookout for any of the following signs: (Page 192)

##### 1. Using the phrase "I believe"

- That seemingly redundant phrase—shouldn't it go without saying that you believe your own statements?—signals that you're not simply describing the world, you're defining yourself. (Page 193)

##### 2. Getting annoyed when an ideology is criticized

- When you feel the urge to step in and defend a group or belief system against perceived criticism, chances are good that your identity is involved. (Page 193)

##### 3. Defiant language

- Proud, standing up, unapologetic, fearless—defiant language like this is a sign that you see yourself as an embattled minority viewpoint facing off against a society that is trying to silence, oppress, or shame you. (Page 194)

##### 4. A righteous tone

##### 5. Gatekeeping

- When a label is more than just a practical description of your beliefs—when it feels like a status symbol or a source of pride—then the question of who else gets to wear that label actually matters. It becomes important to police the identity's boundaries. (Page 195)

##### 6. Schadenfreude

- Deriving pleasure from news that humiliates some ideological group you disagree with is a sign of an "__[[oppositional identity]]__"—an identity defined by what it opposes. It's easy to overlook these because they often don't involve labels of their own, but they can distort your judgment all the same. If you love to hate hippies, techies, libertarians, fundamentalists, or any other ideological group, that gives you a motive to believe anything that seems to discredit their worldview. (Page 196)
    - Tags: [[definition]]

##### 7. Epithets

- If you use epithets like these in talking about a particular issue, that's a sign you're viewing it as a fight between people, not ideas. (Page 196)

##### 8. Having to defend your view

- The more you've argued a position to other people, especially in public, the more it's become linked to your ego and reputation, and the harder it is to abandon that position later. (Page 197)
- The problem is compounded if you've had to defend your view against unfair or aggressive criticism. Now, changing your mind feels like letting the enemy win. (Page 197)
- The problem with identity is that it wrecks your ability to think clearly. Identifying with a belief makes you feel like you have to be ready to defend it, which motivates you to focus your [[attention]] on collecting evidence in its favor. Identity makes you reflexively reject arguments that feel like attacks on you or the status of your group. It turns empirical questions such as "How large are the health benefits of breastfeeding?"
into questions that are much more emotionally fraught and difficult to think clearly about: "Am I a good mother? Am I a good feminist? Will my friends judge me? Was 'my side' vindicated or humiliated?" And when a belief is part of your identity, it becomes far harder to change your mind, even when the facts change dramatically. (Page 197)
    - Note: Chapter summary

#### Chapter 14: Hold Your Identity Lightly

- "The more labels you have for yourself, the dumber they make you." Inspired in part by Graham's essay, I resolved to avoid identifying myself with any ideology, movement, or group. This plan of mine quickly ran into problems. (Page 199)
- What you need to be able to do is keep those identities from colonizing your thoughts and values. I call this "holding your identity lightly." (Page 200)
- Someone who holds her political identity lightly is happy when her party wins an election. But she's happy because she expects her party to do a better job leading the country, not because the other side suffered a humiliating defeat. She's not tempted to taunt the losers…. (Page 201)
- Holding an identity lightly means treating that identity as *contingent*…. (Page 201)
- The __[[ideological Turing test]]__, suggested by economist Bryan Caplan, is based on similar logic. It's a way to determine if you really understand an ideology: Can you explain it *as a believer would*, convincingly enough that other people couldn't tell the difference between you and a genuine believer? (Page 203)
    - Tags: [[definition]]
- The [[ideological Turing test]] is typically seen as a test of your knowledge: How thoroughly do you understand the other side's beliefs? But it also serves as an emotional test: Do you hold your identity lightly enough to be able to avoid caricaturing your ideological opponents? Even being willing to attempt the [[ideological Turing test]] at all is significant. People who hold their identity strongly often recoil at the idea of trying to "understand" a viewpoint they find disgustingly wrong or harmful. It feels like you're giving aid and comfort to the enemy. But if you want to have a shot at actually changing people's point of view rather than merely being disgusted at how wrong they are, understanding those views is a must. (Page 205)
- Bottom line: it's hard to change someone's mind when you feel morally and intellectually superior to them. As Megan McArdle memorably put it: "It took me years of writing on the Internet to learn what is nearly an iron law of commentary: The better your message makes you feel about yourself, the less likely it is that you are convincing anyone else." (Page 206)
- Reading sources that confirm your beliefs, trusting people whom you're close to—everyone does that. It's just an unfortunate fact that this universal tendency sometimes yields harmful results. (Page 208)
- Acknowledging the weaknesses in your "side" can go a long way toward showing someone from the other side that you're not just a zealot parroting dogma, and that you might be worth listening to. (Page 208)
- Usually, however, activists face trade-offs between identity and impact—and the more lightly you hold your identity, the more you can focus exclusively on actions with the highest impact. (Page 209)
- You've probably known activists who spend the bulk of their energy fighting with other activists they're already 95 percent in agreement with over that remaining 5 percent sliver of disagreement.
Sigmund Freud called it the "__[[narcissism of small differences]]__"—for the purposes of affirming your identity, the most tempting fight is often the one that helps distinguish you from your ideological neighbors. (Page 210)
    - Tags: [[definition]]
- To be an effective activist you need to be able to perceive when it will be most impactful to cooperate, and when it will be most impactful to disrupt, on a case-by-case basis. (Page 213)

#### Chapter 15: A Scout Identity

- Blackmore had a second identity, one that was strong enough to counter the first—that of a truth-seeker. (Page 215)
- That's a common theme among people who are good at facing hard truths, changing their mind, taking criticism, and listening to opposing views. [[scout mindset]] isn't a chore they carry out grudgingly; it's a deep personal value of theirs, something they take pride in. (Page 215)
- Compare the following two things you could say to motivate yourself to get out of bed: 1. "I shouldn't break promises to myself." 2. "I'm the kind of person who follows through on their promises." The first statement frames the situation in terms of your obligations. The word *shouldn't* suggests a figurative parent or other authority figure, wagging their finger at you. If you get out of bed, it feels grudging, like you're forcing yourself to do something. By contrast, the second statement frames the situation in terms of your identity. Getting out of bed is now an affirmation of your values, proof that you're living up to the kind of person you want to be. (Page 217)
    - Note: Same guidance from habit-forming work. (E.g. *[[Atomic Habits]]*)
- If you have a scout identity, that's how it feels when you realize you have to change your mind. It's not that it's easy; it still stings a little to realize that you made a mistake or that the person you've been arguing with actually has a point. But that slight sting is a reminder that you're living up to your standards, that you're becoming stronger. And so the sensation becomes pleasurable, in the same way sore muscles can be pleasurable for someone making progress toward getting in shape. (Page 218)
- In chapter 3, we saw how our brains have a built-in bias for short-term rewards, and that this causes us to reflexively reach for [[soldier mindset]] more often than we should. Identities are a patch for that bug. They change the landscape of emotional incentives, allowing us to feel rewarded in the short term by choices that technically pay off only in the long term. (Page 218)
- I've focused almost exclusively on what you as an individual can do to change your thinking, holding the world around you constant, because I wanted this book to be useful to you right away. But in the medium-to-long term, one of the biggest things you can do to change your thinking is to change the people you surround yourself with. We humans are social creatures, and our identities are shaped by our social circles, almost without our noticing. (Page 219)
- You can make the effort to think honestly no matter what community you're embedded in. But your friends, coworkers, and audience can either provide a headwind or a tailwind for your efforts. (Page 220)
- If you strive to be a scout, it's true that you won't please everyone. However—as your parents may have told you growing up—that's impossible anyway. So you might as well aim to please the kind of people you'd most like to have around you, people who you respect and who motivate you to be a better version of yourself.
(Page 222)
- The people you read, follow, and talk to online help shape your identity just like the people in your "real-life" communities do. That can reinforce [[soldier mindset]] if you spend your time with people who make you feel angry, defensive, and contemptuous. Or it can reinforce [[scout mindset]] if you spend your time in places like ChangeAView or r/FeMRADebates. You can even create a loosely knit "community" for yourself by making connections with people you see setting a good example of [[scout mindset]] online—bloggers, authors, or just random people on social media. (Page 224)
- Personally, I find all those facets of [[scout mindset]] inspiring—the willingness to prioritize impact over identity; the confidence to be unconfident; the courage to face reality. But if I were to name one single facet I find most inspiring, it's the idea of being *[[intellectually honorable]]*: wanting the truth to win out, and putting that principle above your own ego. (Page 227)
    - Tags: [[definition]]

### Conclusion

- Part of "understanding and working with what's real" is accepting the fact that [[soldier mindset]] is part of your wiring. That doesn't mean you can't change the way you think, of course. But it does mean you should be aiming to take *incremental* steps in the direction from soldier to scout rather than expecting yourself to be 100 percent scout overnight. (Page 230)
- …consider making a plan for what those incremental steps toward [[scout mindset]] might look like for you. I recommend picking a small number of scout habits to start with, no more than two or three. Here's a list of ideas to choose from:
    1. The next time you're making a decision, ask yourself what kind of bias could be affecting your judgment in that situation, and then do the relevant thought experiment (e.g., [[outsider test]], [[conformity test]], [[status quo bias test]]).
    2. When you notice yourself making a claim with certainty ("There's no way..."), ask yourself how sure you really are.
    3. The next time a worry pops into your head and you're tempted to rationalize it away, instead make a concrete plan for how you would deal with it if it came true.
    4. Find an author, media outlet, or other opinion source who holds different views from you, but who has a better-than-average shot at changing your mind—someone you find reasonable or with whom you share some common ground.
    5. The next time you notice someone else being "irrational," "crazy," or "rude," get [[curious]] about why their behavior might make sense to them.
    6. Look for opportunities to update at least a little bit. Can you find a caveat or exception to one of your beliefs, or a bit of empirical evidence that should make you slightly less confident in your position?
    7. Think back to a disagreement you had with someone in the past on which your perspective has since shifted and reach out to that person to let them know how you've updated.
    8. Pick a belief you hold strongly and attempt an [[ideological Turing test]] of the other side. (Bonus points if you can actually find someone from the other side to judge your attempt.)

    Whichever other habits you choose to focus on, here's one more that should be on your list: Keep an eye out for examples of [[motivated reasoning]] in yourself—and when you spot one, *be proud of yourself for noticing*. Remember, [[motivated reasoning]] is universal; if you never notice it, that's probably not because you're immune.
    Becoming more aware of [[motivated reasoning]] is an essential step on the way to reducing it, and you should feel good about taking that step. (Page 230)
    - Tags: [[inbox]]
- At the end of the day, we're a bunch of apes whose brains were optimized for defending ourselves and our tribes, not for doing unbiased evaluations of scientific evidence. So why get angry at humanity for not being uniformly great at something we didn't evolve to be great at? Wouldn't it make more sense to appreciate the ways in which we *do* transcend our genetic legacy? (Page 231)