↑ [[Rationality MOC]] ← [[Noticing Confusion Overview]]

*Mysterious Answers to Mysterious Questions* is the fourth part of Book I of *The Sequences*. It reviews the basic tenets of contemporary rationalism and then examines some of the subtler ways bias arises in the quest for knowledge, particularly in the sciences.

Part of developing skillfulness in rational thinking is the ability to recognize **mysterious answers**: responses that are as nebulous as the questions they purport to resolve. In appeals to religion or esoterica, these types of answers can be obvious: "God" or "magic" contribute no new information to the observation of a phenomenon. But there are other cases in which established science can be used inappropriately as [[1. Fake Explanations|fake explanations]]. These are cases in which someone *performs* the role of explaining without actually offering any relevant information. As with other biases discussed in earlier sections, these types of explanations often exist on the social layer of inquiry, not within the inquiry itself.

One such instance is [[guessing the password|guessing the password]], which usually occurs when an authority figure poses a question with rewards or punishments attached, such as a college placement test or a job interview. It is important to distinguish between situations asking for explanations and those asking for passwords. Passwords are arbitrary phrases that may or may not have factual relevance, but are connected to the social goals of the group at hand. They may be useful, but they are not relevant to gaining knowledge of the subject matter.

While passwords imply a degree of intellectual gatekeeping between insiders and outsiders, insiders will use science as [[belief as attire|attire]] with each other. This is a subset of the type of reasoning discussed in [[belief as attire]], except it uses proven theory to explain an idea without fully understanding the idea itself.
It is an irresponsible use of a valid idea. This type of fake explanation can be especially insidious, as it can misdirect anyone genuinely inquiring about the subject's validity. ==**IT WOULD BE GOOD TO HAVE AN EXAMPLE OF THIS**==

Similarly, science can be used as a [[5. Semantic Stopsigns|semantic stopsign]]: any word or phrase used to end a discussion that is devolving into more complexity than the respondent is willing to engage with. Instead of a password, an explanation is used as a [[15. Science as Curiosity-Stopper|curiosity-stopper]]. [[Semantic Signpost|God]] is a popular example, an appeal to authority that doesn't constrain possibilities in the conversation, but there are other, less overt examples such as "evolution," "Science," or "democracy." Generally, semantic stopsigns are umbrella terms that contain the generally accepted norms of the group conversing. A Christian will use "God"; people on Twitter will say "Science."

[[6. Mysterious Answers to Mysterious Questions|Mysterious answers]] don't offer new information; they wrap the question in a substance that defies explanation. They eliminate further investigation instead of encouraging it. Any explanation that does not produce further knowledge can be considered fake. Mysterious answers do this by redefining the mystery as an actual feature of the question. Other types of mysterious answers are

- [[7. The Futility of Emergence|"Emergence"]]
- [[8. Say Not “Complexity”|"Complexity"]]
- Entanglement
- Ecological democracy

It should be noted that the words themselves are not problematic; their usage is. In all cases, they make the mistake of attempting to provide an explanation without actually thinking about the problem. All mysterious answers possess a quality of [[11. My Wild and Reckless Youth|"wondrous impenetrability"]].

>[!quote] Eliezer Yudkowsky
>Fake explanations don't feel fake. That's what makes them dangerous.
Fake explanations *feel* real because they are based on loosely *connected* but not necessarily *causal* evidence, so it is possible to mistake such evidence for [[4. Fake Causality|fake causality]]. In the 17th and 18th centuries, [phlogiston](https://en.wikipedia.org/wiki/Phlogiston_theory) (now a popular rationalist meme) was the term for the mysterious substance inside flammable matter. As it turns out, this explanation *generally fits* the observable chemical transformation of a combustible material exposed to a heat source: rapid oxidation (fire) occurring at the surface of a fuel does *appear* as if a substance is being drawn out of the material. It is during the follow-up questions that phlogiston theory loses its efficacy: the explanation for a flame extinguished in a covered container was also phlogiston; once released from the fuel, it smothered the flame. This explanation is also partly correct, or loosely connected to the cause, but it isn't the cause itself. The error lies in **double-counting** the phlogiston as evidence of fire, when one ought to be using the *presence* of phlogiston to predict fire.

[[9. Positive Bias|Positive bias]] is one contributor to the formation of mysterious answers. This is the tendency to test for positive results when presented with a pattern, when it is in fact more efficient to test for negative results. This tendency is usually innate; part of developing skillfulness in thought is recognizing these types of subconscious behaviors. Another skill is learning to apply reasoning [[10. Lawful Uncertainty|according]] to the probability set, instead of relying on the randomness within it. This is a response to the [gambler's fallacy](https://en.wikipedia.org/wiki/Gambler%27s_fallacy): the belief that the probability of an outcome changes based on what has occurred in the past.
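The gambler's fallacy is easy to check directly. The following is a minimal simulation sketch (mine, not from *The Sequences*): it estimates how often a fair coin comes up heads immediately after a run of tails. If past outcomes mattered, a long streak of tails would make heads more likely; they don't, so the rate stays near one half.

```python
import random

random.seed(0)

def heads_rate_after_tails_streak(streak_len, flips=200_000):
    """Estimate P(heads) on the flip immediately following
    `streak_len` consecutive tails in a fair-coin sequence."""
    tails_run = 0
    hits = 0
    total = 0
    for _ in range(flips):
        heads = random.random() < 0.5
        if tails_run >= streak_len:
            total += 1
            hits += heads  # bool counts as 1 when heads
        tails_run = 0 if heads else tails_run + 1
    return hits / total

# The gambler's fallacy predicts this rate should exceed 0.5
# ("heads is due"); in fact it hovers around 0.5 regardless
# of how long the preceding streak of tails was.
print(round(heads_rate_after_tails_streak(5), 2))
```

The streak length and flip count here are arbitrary; any values give the same qualitative result, which is the point of reasoning according to the probability set rather than the randomness within it.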
If one is awarded a dollar for correctly guessing "Heads" in a coin toss and not penalized for "Tails," it is always beneficial to guess "Heads" on every toss, because over time it will pay out half the value of the number of tosses (assuming the coin is fair). This requires accepting a certain degree of [[11. My Wild and Reckless Youth|imprecision]] in one's reasoning, and focusing on how to create more precise circumstances through a series of iterative failures. Here the term *rationalist* comes into conflict with its [traditional](https://en.wikipedia.org/wiki/Rationalism) meaning, in which the validity of a claim is achieved through deductive reasoning.

Yudkowsky believes [[12. Failing to Learn from History|experience]] is essential to understanding Mysterious Questions, not just in terms of observation, but in undergoing the emotional transformation that comes with clarity of thought. Part of this is understanding our [[13. Making History Available|evolutionary context]]. This helps us develop compassion for those who appeal to Mysterious Answers: we can see how a Mysterious Answer seems reasonable given someone's circumstances, and we can help them come to better conclusions. We have innate [[14. Explain-Worship-Ignore|modes of inquiry]] with which we pursue phenomena: we explain it, we worship it, or we ignore it. It is possible that the mechanisms of irrationalism are a means of conserving energy; the pursuit of Knowledge is exhausting.

When we choose to explain something, the end result is always a new question. This is the reality of the journey: we resolve one mystery only to encounter a new one. Curiosity is the intellectual motivation to continue pursuing these new mysteries as they arise. Children are excellent at this. Unfortunately, their curiosity does not conveniently fit into the rigorously scheduled lives of their caregivers. And one tragedy of contemporary life is the use of [[15. Science as Curiosity-Stopper]].
These are explanations that are technically true, but do not serve an explanatory function:

> Q: "Why is the sky blue?"
> A: "Physics!"

Even knowledge as a function of rote learning is insufficient for true understanding. Memorizing arithmetic tables does not provide the comprehension that ten sticks can be divided into groups of six and four. Information must be [[16. Truly Part Of You|internalized]] in order to become Knowledge. Records are useful for searching for possible answers, but each must be tested by the rationalist. Developing the cognitive fitness to engage in the Pursuit is the goal of the rationalist. It is what determines the "[difference between knowing the name of something and knowing something](https://en.wikipedia.org/wiki/What_Do_You_Care_What_Other_People_Think%3F)."

### The health industry’s invisible hand is a fist

title: The health industry’s invisible hand is a fist
author: Cory Doctorow
publisher: Pluralistic
published date: 2024-06-13

An example of market efficiency as a mysterious answer. [full text](https://pluralistic.net/2024/06/13/a-punch-in-the-guts/)

#### Highlights

* Capitalism's most dogmatic zealots have a mystical belief in the power of markets to "efficiently allocate" goods and services. For them, the process by which goods and services are offered and purchased performs a kind of vast, distributed computation that "discovers the price" of everything. Our decisions to accept or refuse prices are the data that feeds this distributed computer, and the signals these decisions send about our desires triggers investment decisions by sellers, which guides the whole system to "equilibrium" in which we are all better off. There's some truth to this: when demand for something exceeds the supply, prices tend to go up. These higher prices tempt new sellers into the market, until demand is met and prices fall and production is stabilized at the level that meets demand.
* But this elegant, self-regulating system rarely survives contact with reality. It's the kind of simplified model that works when we're hypothesizing about perfectly spherical cows of uniform density on a frictionless surface, but ceases to be useful when it encounters a messy world of imperfect rationality, imperfect information, monopolization, regulatory capture, and other unavoidable properties of reality. For members of the "efficient market" cult, reality's stubborn refusal to behave the way it does in their thought experiments is a personal affront. Panged by cognitive dissonance, the cult members insist that any market failures in the real world are illusions caused by not doing capitalism hard enough. When deregulation and markets fail, the answer is always more deregulation and more markets. That's the story of the American health industry in a nutshell. Rather than accepting that people won't shop for the best emergency room while unconscious in an ambulance, or that the "clearing price" of "not dying of cancer" is "infinity," the cult insists that America's worst-in-class, most expensive health system just needs more capitalism to turn it into a world leader.
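The price-discovery story in the first highlight, where excess demand pushes prices up until supply meets demand, can be sketched as a toy simulation. The linear demand/supply curves and the adjustment step below are illustrative assumptions of mine, not anything from the article; the point is only that the idealized feedback loop does converge to an equilibrium, which is the simplified model Doctorow says rarely survives contact with reality.

```python
# Toy price-adjustment ("tatonnement") sketch with made-up linear curves.
def demand(price):
    return max(0.0, 100 - 2 * price)  # buyers want less as price rises

def supply(price):
    return 3 * price  # higher prices tempt more sellers into the market

price = 5.0
for _ in range(500):
    # Excess demand pushes the price up; excess supply pushes it down.
    price += 0.01 * (demand(price) - supply(price))

# Analytic equilibrium: 100 - 2p = 3p, so p = 20.
print(round(price, 2))  # prints 20.0
```

Everything that makes real markets messy (imperfect information, monopoly power, demand that cannot respond to price, as with emergency care) is exactly what this sketch leaves out.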