↑ [[Rationality MOC]]
← [[Fake Beliefs]]
"Noticing Confusion" is the third section of Book I of *[The Sequences](https://www.lesswrong.com/s/7gRSERQZbqTuLX5re).* It focuses on the cognitive cues, the eponymous *confusion*, that signal when one is engaged with bias. The prevailing theme is that one's [strength as a rationalist](https://www.lesswrong.com/posts/5JDkW4MYXit2CquLs/your-strength-as-a-rationalist) is the "ability to be more confused by fiction than by reality." Determining between fiction and reality is a matter of looking for [[evidence]] and determining [what kind](https://www.readthesequences.com/Scientific-Evidence-Legal-Evidence-Rational-Evidence) and [how much](https://www.lesswrong.com/posts/nj8JKFoLSMEmD3RGp/how-much-evidence-does-it-take) is needed in order to confirm, refute, or develop a given claim. Bias distorts reality by requiring additional arguments to confirm inaccurate beliefs. It might be helpful to think of these as mental Rube Goldberg devices. With regard to evidence, the most prevalent form of confusion is [[10. Hindsight Devalues Science|hindsight bias]].
For the purposes of everyday life, I think it's important to focus on *rational* evidence, as opposed to scientific or legal evidence, which represent standards of belief for a community (legal) or about the universe (science) and thus require higher standards of review. **Evidence**, as a general rule, is any event [[evidence#^e87e5b|entangled]] with the observations leading to the inquiry at hand. Similarly, we might say that a **belief** is a set of *predictions* entangled with a set of *experiences*. For evidence to be considered rational, it must prohibit certain explanations or outcomes. The larger or more complex the claim, the more evidence is needed to support it. This could be done mathematically, by assigning probability values to a set of outcomes given a belief statement, but I think it is enough to simply *take evidence seriously*. Electing not to do so doesn't prohibit holding the belief, but it does limit the belief's accuracy, and if the belief is false, sustaining it will ultimately require ever greater effort.
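As a gesture toward that math, here's a minimal sketch (my own illustration, not from the Sequences; the numbers are hypothetical) of the Bayesian update it alludes to: a prior credence in a belief is revised according to how likely the observation is under the belief versus under its alternatives.

```python
def bayes_update(prior: float, p_obs_given_belief: float, p_obs_given_not: float) -> float:
    """Return the posterior credence in a belief after one observation.

    prior:              P(belief) before the evidence comes in
    p_obs_given_belief: P(observation | belief)    -- how entangled the
    p_obs_given_not:    P(observation | ~belief)   -- evidence is with each side
    """
    p_obs = p_obs_given_belief * prior + p_obs_given_not * (1 - prior)
    return p_obs_given_belief * prior / p_obs

# Evidence that is nearly as likely either way barely moves the belief...
print(bayes_update(prior=0.5, p_obs_given_belief=0.55, p_obs_given_not=0.45))  # ~0.55
# ...while an observation the alternatives nearly prohibit moves it a great deal.
print(bayes_update(prior=0.5, p_obs_given_belief=0.90, p_obs_given_not=0.05))  # ~0.95
```

The numbers restate the point above: rational evidence has to *prohibit* outcomes, and an observation that is equally probable under every explanation moves nothing.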
For the sake of argument, the relationship between belief and evidence is presented as occurring in a linear fashion:
>[!important] Belief statement → collection of evidence → analysis of evidence → confirmation (or rebuttal) of the belief.
But everyday experience suggests beliefs form as a result of consistent exposure to relevant evidence:
>[!important] Collection of observations → recognition of a pattern → formation of belief → analysis of evidence → confirmation of belief.
In this light, we might regard **hunches** and **intuition** as means of [[focusing uncertainty|focusing uncertainty]]. This lends itself to one of the common criticisms of intelligent people: [[5. Einstein's Arrogance|intellectual arrogance]], which is misread confidence in one's conclusions. Here it's important to note that a hunch is not a conclusion; it is a sense about an **intellectual direction**. The observational evidence *leads* to a subset of hypotheses within the entire set of possibilities. We might have several explanations, all varying in complexity. [[6. Occam's Razor|Occam's Razor]] directs us to prefer the simplest one, but applying that principle does not by itself make an explanation rational; an explanation also has to prohibit the others. It cannot be a simple statement that admits *all* the possibilities, and a spoken statement (e.g., "God did it") often obscures the complexity hidden within the meaning of its words. Words wrap themselves around ideas that require a certain amount of prior knowledge from the audience. If those prerequisites aren't met, the claimant must close the [[inferential distance]] between themselves and their audience, or else fall victim to the [[11. Illusion of Transparency#^36e5f7|illusion of transparency]].
Religion again takes the role of scapegoat in Yudkowsky's argument. The manner in which women were investigated and tried for witchcraft during the early modern period demonstrates the improper application of rules governing evidence. In these cases, the conclusion of a [[Witches|witch]]'s guilt preceded the reasoning given to convict her, resulting in situations where *any* behavior was evidence of guilt. Such arguments defy [[conservation of expected evidence|conservation of expected evidence]]: a high expectation of evidence for a theory must be balanced by a correspondingly low expectation of evidence against it. Another example is the claim that God proves His existence precisely by not making Himself known; by the same conservation, an observed miracle would then have to count as evidence *against* the existence of God.
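Stated formally (this is the standard identity, not original to this note), conservation of expected evidence says the prior must equal the probability-weighted average of the possible posteriors:

$$
P(H) = P(H \mid E)\,P(E) + P(H \mid \neg E)\,P(\neg E)
$$

If one behavior counts as evidence of guilt, its absence must count as evidence of innocence; a theory under which *every* possible observation confirms $H$ violates this identity.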
While supernatural explanations may be easy to dismiss, someone committed to taking evidence seriously may find it more helpful to understand the circumstances that lead a claimant to make preposterous statements than to deride them for their beliefs. But what about a claim that falls within the parameters of observable reality? This is where the ability to pinpoint confusion shows its real strength. It has been [argued](https://psycnet.apa.org/record/1993-44074-001) that belief is a prerequisite for comprehension, and thus that our minds are organized to initially accept an explanation. In a tribal setting, initial belief in new information likely led to the discovery of new resources, so we have inherited a bias under which disbelieving a statement takes more cognitive effort than believing it.[^1]
[^1]: Note: This is important: we shouldn't resist a natural inclination. I think taking the next step is more a matter of training or social conditioning, in which we subordinate our own inward cognitive authority to the external one provided for us. #saymore
This quirk of evolution, however, doesn't guarantee that a claim fits the facts. Confusion, a niggling sense at the back of the brain, is the signal that a claim should be tested. Knowledge is not the ability to explain any outcome; it is the result of [[Epistemology|attention and effort]] directed toward understanding a topic. Testing a belief is a matter of probabilities: if observing a piece of evidence would strengthen a claim, then failing to observe it must weaken the claim, with the two possibilities weighted so that, in expectation, belief doesn't move at all. Noticing confusion is merely a signal to evaluate how far we want to engage with the truth. We may not care enough about the claim to go any further than noting the feeling to ourselves; on the other hand, acknowledging the feeling may be imperative to our physical safety. Confusion is not an indication of a false statement; it is an indication of error somewhere along a true/false spectrum. The strength of one's response to the feeling of confusion should be appropriate to the situation.
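To make that balancing concrete, here's a small numeric check (my own illustration; the likelihoods are hypothetical): whatever probabilities a piece of evidence carries, the probability-weighted average of the two possible posteriors lands exactly back on the prior.

```python
prior = 0.3                                    # P(H) before looking
p_e_given_h, p_e_given_not_h = 0.8, 0.2        # hypothetical likelihoods of evidence E

p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)      # P(E)
post_if_e     = p_e_given_h * prior / p_e                      # P(H | E)  ~0.632
post_if_not_e = (1 - p_e_given_h) * prior / (1 - p_e)          # P(H | ~E) ~0.097

expected_posterior = post_if_e * p_e + post_if_not_e * (1 - p_e)
print(round(expected_posterior, 10))           # 0.3 -- exactly the prior
```

A belief you expect to strengthen no matter what you observe is, by this arithmetic, not being tested at all.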