In [[information theory]], the concepts **mutually exclusive and exhaustive** describe how a set of possible outcomes is categorized or partitioned, particularly in probability distributions and decision-making.

**Mutually exclusive**: a set of events or categories is mutually exclusive if no two events can occur at the same time; if one event happens, all others are ruled out. In probability terms, $P(A \cap B) = 0$ for any two mutually exclusive events $A$ and $B$.

**Collectively exhaustive**: a set of events is collectively exhaustive if it covers all possible outcomes; a set of categories is exhaustive if it covers the entire concept space. In probability terms, $P(A \cup B \cup C \cup \dots) = 1$. When the events are also mutually exclusive, this reduces to $P(A) + P(B) + P(C) + \dots = 1$. (A partition check along these lines is sketched below.)

When designing efficient encoding schemes (e.g., [[Huffman codes]]), it is critical that the codewords for all possible symbols are both mutually exclusive (no codeword is a prefix of another, so decodings cannot overlap) and exhaustive (every input symbol can be encoded); see the prefix-free sketch below.

In [[decision trees]] or [[Bayesian classifiers]], the aim is a set of mutually exclusive decision rules that together cover all cases (i.e., are exhaustive), making classification robust; a minimal illustration follows the other sketches below.

In some classification systems, such as library catalogs, it is not practical to assign each book a single topic, so cross-referencing is used instead.
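As a concrete illustration of the two definitions, the following sketch checks both properties for a finite sample space. The setup is hypothetical: a fair six-sided die with three made-up event names (`low`, `mid`, `high`).

```python
from itertools import combinations

# Hypothetical example: a six-sided die, with events defined as outcome sets.
sample_space = {1, 2, 3, 4, 5, 6}
events = {
    "low": {1, 2},
    "mid": {3, 4},
    "high": {5, 6},
}

# Mutually exclusive: no two events share an outcome (pairwise disjoint).
mutually_exclusive = all(
    a.isdisjoint(b) for a, b in combinations(events.values(), 2)
)

# Collectively exhaustive: the union of all events covers the sample space.
exhaustive = set().union(*events.values()) == sample_space

print(mutually_exclusive, exhaustive)  # True True

# Because the events form a partition of equally likely outcomes,
# their probabilities sum to 1.
probs = {name: len(e) / len(sample_space) for name, e in events.items()}
assert abs(sum(probs.values()) - 1.0) < 1e-9
```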
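For the encoding case, the prefix-free property can be checked mechanically. The sketch below uses a hypothetical codebook (the codewords are illustrative, not the output of an actual Huffman construction): if no codeword is a prefix of another and every symbol has a codeword, greedy decoding is unambiguous.

```python
# Hypothetical codebook for symbols 'a'..'d'; codewords chosen for illustration.
codebook = {"a": "0", "b": "10", "c": "110", "d": "111"}

def is_prefix_free(codes):
    """No codeword is a prefix of another (mutually exclusive decodings).

    After sorting lexicographically, any prefix violation appears between
    adjacent entries, so checking neighbors suffices.
    """
    words = sorted(codes.values())
    return all(not words[i + 1].startswith(words[i]) for i in range(len(words) - 1))

def covers_alphabet(codes, alphabet):
    """Every input symbol has a codeword (exhaustive over the alphabet)."""
    return set(codes) >= set(alphabet)

assert is_prefix_free(codebook)
assert covers_alphabet(codebook, "abcd")

def decode(bits, codes):
    """Greedy decoding; unambiguous precisely because the code is prefix-free."""
    inverse = {v: k for k, v in codes.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

msg = "abacd"
bits = "".join(codebook[s] for s in msg)
assert decode(bits, codebook) == msg
```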
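Finally, for the classification case, a minimal sketch with hypothetical threshold rules on one numeric feature: the `if`/`elif`/`else` chain makes the rules mutually exclusive (the intervals do not overlap) and exhaustive (every input falls into exactly one branch), so no case goes unclassified.

```python
# Hypothetical decision rules on a single numeric feature x.
def classify(x: float) -> str:
    if x < 0:
        return "negative"
    elif x < 10:        # reached only when 0 <= x < 10
        return "small"
    else:               # reached only when x >= 10
        return "large"

# Every input receives exactly one label; no case is left uncovered.
for x in (-3.2, 0.0, 7.5, 10.0, 42.0):
    print(x, "->", classify(x))
```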