#causation #arrow #data #inference
Causation is a relational concept, referring to the connection between cause and effect, where one event or action (the [[causality|cause]]) leads to a specific outcome or consequence (the effect). This relationship can be encoded as directed relations, or arrows, in category theory, symbolizing the directional flow from cause to effect. This approach aligns with the Chinese concept of [[Tao]] ([[道]]), which represents the fundamental principle underlying the natural order and the interconnectedness of all things. Just as Tao denotes the path or way things unfold, causation, depicted as arrows in category theory, illustrates the directional and relational nature of events unfolding over [[time]].
# An Hourglass generated by DALL-E
![[DALLE_CAUSALCONE.png]]
Causation involves the idea that certain events or conditions are responsible for bringing about certain results. Causation implies a direct connection between the cause and effect, suggesting that the presence or occurrence of the cause will result in a predictable effect. Understanding causation is crucial in various fields such as science, philosophy, law, and the social sciences, as it helps explain how things happen and allows for predicting and controlling outcomes. Under many circumstances, especially in scientific or engineering literature, causation is studied in terms of [[dynamical system|dynamical systems]].
## Speed of Causation
The [[light speed|speed of light]] is related to the [[Electric permittivity|electric permittivity]] of free space ($\epsilon_0$) and [[Magnetic permeability|magnetic permeability]] of free space ($\mu_0$). The relationship between these three quantities is given by the equation:
$c = \frac{1}{\sqrt{\epsilon_0 \mu_0}}$
This relationship is derived from [[Maxwell’s equations]], which describe the behavior of electromagnetic waves. Permittivity and permeability are properties of a medium that describe its response to electric and magnetic fields, respectively. Permittivity (ε) measures how strongly a material polarizes in response to an electric field, while permeability (μ) measures how strongly it magnetizes in response to a magnetic field. These properties matter for electromagnetic waves, such as light, whose speed in a given medium depends on the values of its permittivity and permeability.
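A minimal Python sketch (using the standard CODATA values for $\epsilon_0$ and $\mu_0$) confirms that the formula above reproduces the familiar value of $c$:
```python
import math

# Vacuum permittivity and permeability (CODATA values, SI units)
epsilon_0 = 8.8541878128e-12   # F/m
mu_0 = 1.25663706212e-6        # N/A^2 (equivalently H/m)

# c = 1 / sqrt(epsilon_0 * mu_0)
c = 1.0 / math.sqrt(epsilon_0 * mu_0)
print(f"c ≈ {c:,.0f} m/s")     # ≈ 299,792,458 m/s
```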
However, the speed of causation, as commonly understood, refers to the speed at which cause and effect propagate or influence each other in the universe. It pertains to how quickly information or influence can travel from one point to another and is not considered to be dependent on a particular medium's permittivity or permeability.
In most physical theories, including classical physics and the [[Theory of Relativity|theory of relativity]], the speed of causation is considered to be equal to the speed of light in a [[vacuum]], denoted by the symbol "c." According to the theory of relativity, this speed sets an upper limit on the transmission of information or causal influences through space. It is a fundamental constant of nature and plays a crucial role in our understanding of the universe.
Permittivity and permeability are properties of a particular medium that govern how electromagnetic fields propagate through it, whereas causation is more abstract and can be transmitted through many kinds of media, or combinations of media. They are therefore not directly related to the speed of causation, except in the limiting sense that the speed of causation is typically taken to be the speed of light in a vacuum, which is itself calculated from the vacuum permittivity and permeability.
## Causation and Entropy
Entropy and causation are concepts from different fields, but they are related in certain ways.
[[Entropy]] is a concept from thermodynamics and statistical mechanics that measures the disorder or randomness in a system. It quantifies the number of possible microscopic configurations that a system can have given its macroscopic properties. In simple terms, entropy represents the degree of uncertainty or lack of information about the state of a system.
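To make "degree of uncertainty" concrete, here is a minimal Python sketch (the two toy distributions are invented for illustration) computing the Shannon entropy of a sharply peaked distribution versus a uniform one:
```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2 p), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A nearly certain (low-uncertainty) system vs. a maximally uncertain one
peaked  = [0.97, 0.01, 0.01, 0.01]   # we almost know the state
uniform = [0.25, 0.25, 0.25, 0.25]   # every microstate equally likely

print(shannon_entropy(peaked))   # ≈ 0.24 bits
print(shannon_entropy(uniform))  # = 2.0 bits
```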
Causation, on the other hand, is a concept from philosophy and science that deals with the relationship between cause and effect. It explores how one event or phenomenon leads to another, establishing a cause-and-effect relationship between them.
The relation between entropy and causation can be understood through the concept of determinism. Determinism is the philosophical position that every event has a cause, and given the initial conditions of a system and knowledge of its laws, it is possible to predict its future state with certainty.
In this context, entropy has been related to causation in two main ways:
1. Causal Relevance: Entropy can be seen as an indicator of causal relevance or information flow between variables. In complex systems, a high conditional entropy of one variable given another means that the second provides little information or predictability about the first, suggesting a weak causal connection. Conversely, a low conditional entropy suggests strong causal relevance, where one variable provides meaningful information about another (a toy illustration follows this list).
2. Arrow of Time: Entropy has also been associated with the arrow of time or the directionality of causation. According to the second law of thermodynamics, entropy tends to increase over time in isolated systems. This increase in entropy gives rise to irreversibility and asymmetry in physical processes. Therefore, entropy can be seen as a measure of temporal asymmetry or directionality in causation.
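As a toy illustration of the causal-relevance reading, the sketch below (with invented joint distributions; note that mutual information by itself does not determine the direction of causation) contrasts a pair of variables that strongly constrain each other with a pair that are nearly independent:
```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# X strongly constrains Y (low conditional entropy, high mutual information)
strong = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
# X tells us almost nothing about Y (near-independence)
weak   = {(0, 0): 0.26, (0, 1): 0.24, (1, 0): 0.24, (1, 1): 0.26}

print(mutual_information(strong))  # ≈ 0.53 bits
print(mutual_information(weak))    # ≈ 0.001 bits
```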
It's important to note that while there are connections between entropy and causation, they operate at different levels and have distinct meanings within their respective fields. Entropy deals with physical systems' properties and information content, while causation explores the relationship between events and their causes.
One way to think about the relationship between entropy and causation is that entropy measures the amount of uncertainty or disorder in a system, while causation is a relationship between two [[event|events]] in which one event (the cause) brings about the other (the effect). In this view, entropy can be read as a measure of how causally determined a system is: a system with low entropy is highly causally determined, while a system with high entropy is less so.
Another way to think about the relationship is that entropy measures the amount of information in a system, and causation is a relationship between two events in which the cause carries information about the effect. In this view, entropy can be read as a measure of how informative a system is: a system with low entropy is highly informative about its state, while a system with high entropy is less so.
Finally, some researchers have argued that entropy and causation are two sides of the same coin. In this view, entropy is a measure of the amount of information that is unavailable, while causation is a measure of the amount of information that is available. In other words, entropy is a measure of the uncertainty about the future, while causation is a measure of our knowledge about the future.
The relationship between entropy and causation is a complex and fascinating topic. There is no single answer that captures all of the nuances of this relationship. However, the different perspectives on this relationship provide us with a deeper understanding of both entropy and causation.
Here are some specific examples of how entropy and causation have been related in recent research:
- **Transfer entropy:** Transfer entropy is a measure of the amount of information that is transferred from one system to another. It has been shown that transfer entropy can be used to infer causal relationships between systems.
- **Directed information:** Directed information is a measure of the amount of information that is transferred from one system to another in a specific direction. It has also been shown that directed information can be used to infer causal relationships between systems.
- **Causal entropy:** Causal entropy is a measure of the amount of uncertainty about the future that is due to causal relationships. It has been shown that causal entropy can be used to quantify the strength of causal relationships.
These are just a few examples of how entropy and causation have been related in recent research. This is an active area of research, and there is still much that we do not understand about the relationship between these two concepts. However, the research that has been done so far suggests that entropy and causation are closely related, and that they can be used to better understand the world around us.
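To make the transfer-entropy idea concrete, here is a rough plug-in estimate for binary time series (a sketch only, with an invented noisy-copy process; real analyses would use an established estimator and longer histories). Because the second series copies the first with a one-step delay, information flows almost entirely in one direction:
```python
import math, random
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in estimate (bits) of T_{src -> dst} with history length 1:
    how much src[t] reduces uncertainty about dst[t+1] beyond dst[t] alone."""
    triples = Counter(zip(dst[1:], dst[:-1], src[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yy = Counter(zip(dst[1:], dst[:-1]))             # (y_{t+1}, y_t)
    pairs_yx = Counter(zip(dst[:-1], src[:-1]))             # (y_t, x_t)
    singles_y = Counter(dst[:-1])                           # y_t
    n = len(dst) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_y0x0 = c / pairs_yx[(y0, x0)]
        p_y1_given_y0 = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * math.log2(p_y1_given_y0x0 / p_y1_given_y0)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(10000)]
# y copies x with a one-step delay plus 10% noise, so causation flows x -> y
y = [0] + [xi if random.random() > 0.1 else 1 - xi for xi in x[:-1]]

print(transfer_entropy(x, y))  # ≈ 0.5 bits: x's past informs y's future
print(transfer_entropy(y, x))  # ≈ 0: y's past adds nothing about x's future
```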
## Judea Pearl's view on causation
In the literature of [[Bayesian Belief Network]], or [[Belief Propagation]], causation refers to the relationship between [[causation|cause and effect]], specifically focusing on the question of how causes lead to effects in a systematic manner. Judea Pearl is a prominent computer scientist and philosopher known for his work in causal inference and the development of causal graphical models.
According to Pearl's causal framework, causation is distinct from mere correlation or statistical association. While correlation captures the statistical relationship between variables, causation delves into understanding the underlying mechanisms and dependencies that give rise to the observed associations.
Pearl emphasizes the importance of [[Counterfactual|counterfactuals]] in understanding causation. Counterfactuals refer to hypothetical statements about what would have happened if certain conditions were different. They enable us to reason about the causal effects of interventions or changes in variables.
Pearl's framework relies on the notion of causal models, which involve the use of causal graphs or [[Directed acyclic graph|directed acyclic graphs]] ([[Directed acyclic graph|DAGs]]) to represent causal relationships between variables. In these graphs, variables are represented as nodes and directed edges indicate causal dependencies or influences.
Causal inference in Pearl's framework involves using [[Data|data]] and causal models to make inferences about causal relationships. It includes techniques such as do-calculus, which enables researchers to compute the effects of interventions or to estimate causal effects from observational data using specific assumptions and identification conditions.
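A minimal simulation sketch of the observational-versus-interventional distinction (the linear model with confounder $Z \to X$, $Z \to Y$, $X \to Y$ and its coefficients are invented for illustration; this is a numerical contrast, not Pearl's do-calculus machinery itself):
```python
import random

random.seed(1)
N = 100_000

def sample(do_x=None):
    """One draw from the toy SCM: Z -> X, Z -> Y, X -> Y (true causal effect of X on Y is 1.0)."""
    z = random.gauss(0, 1)                                     # unobserved confounder
    x = 2.0 * z + random.gauss(0, 1) if do_x is None else do_x  # do(X=x) severs Z -> X
    y = 1.0 * x + 3.0 * z + random.gauss(0, 1)
    return x, y

# Observational data: regressing Y on X mixes the causal effect with confounding by Z
obs = [sample() for _ in range(N)]
xs, ys = zip(*obs)
mx, my = sum(xs) / N, sum(ys) / N
slope_obs = sum((x - mx) * (y - my) for x, y in obs) / sum((x - mx) ** 2 for x in xs)

# Interventional data: setting X directly exposes the true effect
y_do0 = sum(sample(do_x=0.0)[1] for _ in range(N)) / N
y_do1 = sum(sample(do_x=1.0)[1] for _ in range(N)) / N

print(f"observational regression slope: {slope_obs:.2f}")                        # ≈ 2.2 (biased by Z)
print(f"interventional effect E[Y|do(X=1)] - E[Y|do(X=0)]: {y_do1 - y_do0:.2f}")  # ≈ 1.0
```
Conditioning on X in the observational data picks up the backdoor path through Z, while intervening with do(X) recovers the true coefficient of 1.0, which is the distinction Pearl's framework formalizes.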
Pearl's work on causation has contributed to the development of a rigorous and systematic approach to understanding cause-and-effect relationships, providing a foundation for causal inference and reasoning. His framework has been influential in various fields, including statistics, machine learning, epidemiology, and social sciences, where understanding and identifying causal relationships are crucial for making informed decisions and understanding complex systems.
It is important to note that Pearl's framework on causation is just one perspective among various philosophical and scientific discussions on the nature and interpretation of causation.
# Geoffrey Hinton's definition of thought
Starting at [16:38](https://youtu.be/zl99IZvW7rE?si=h7xUHQd4oI0tLjr2&t=996) of the video [[@GeoffreyHintonFoundations2018#What thinking is!]], [[Geoffrey Hinton]] stated:
> Thought is the activity pattern that causes things to happen!
This astute assertion suggests that any activity pattern can contribute to forming a causal relationship: a causal relationship emerges when an outcome becomes associated with the activity pattern, and that association is the crucial point. Even if a given association between observed outcomes and a pattern turns out not to be genuinely causal, the framing of thought as a pattern that causes things to happen still holds.
# References
```dataview
Table title as Title, authors as Authors
where contains(subject, "Causation") or contains(subject, "Causality")
```