↑ [[How Computers Work]]
**Binary code** is the representation of text and data using a two-symbol system, typically `1` and `0`, and is a fundamental concept in computer science. Its premise was introduced mathematically by Gottfried Leibniz in the late 17th century, though the principle had existed in many cultures long before its formal development in the sciences. Leibniz’s system was further developed by George Boole in *[The Mathematical Analysis of Logic](https://www.gutenberg.org/ebooks/36884)* into an algebra built on `AND`, `OR`, and `NOT` operations, which Claude Shannon adapted and applied to electrical circuits in his thesis, *[A Symbolic Analysis of Relay and Switching Circuits](https://dspace.mit.edu/bitstream/handle/1721.1/11173/34541425-MIT.pdf?sequence=1&isAllowed=y)*.
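The three operations above can be sketched directly in Python, which exposes them both as logical keywords and as bitwise operators on integers (a minimal illustration, not tied to any particular source formulation):

```python
# Boole's three primitive operations applied to single binary digits.
# On Python integers, & (AND), | (OR), and ^/subtraction tricks mirror
# the same rules Shannon mapped onto relay circuits.
a, b = 1, 0

print(a & b)   # AND: 1 only if both inputs are 1 -> 0
print(a | b)   # OR:  1 if at least one input is 1 -> 1
print(1 - a)   # NOT: inverts a single bit (1 -> 0, 0 -> 1)
```

Shannon's insight was that a relay being open or closed behaves exactly like such a `0` or `1`, so circuits wired in series compute `AND` and circuits wired in parallel compute `OR`.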
In most modern computing systems, data is encoded as [[Bit]] strings: each symbol is represented by a sequence of `1`s and `0`s, which the CPU reads as a series of discrete electrical signals and which is decoded back into the original symbol when presented to the user.
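This round trip from symbol to bit string and back can be demonstrated in a few lines of Python, using UTF-8 as the encoding (one common choice among many):

```python
# Encode text as a bit string, then decode it back to the original symbols.
text = "Hi"

# Each character becomes one or more bytes; each byte is eight bits.
bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
print(bits)  # 0100100001101001

# Reverse the process: regroup into bytes, then decode.
decoded = bytes(
    int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)
).decode("utf-8")
print(decoded)  # Hi
```

Here `H` is stored as `01001000` (decimal 72) and `i` as `01101001` (decimal 105); the hardware sees only the high/low signal pattern, and the encoding standard supplies the mapping back to text.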
Historical examples of binary systems include the *I Ching*, which uses the duality of *yin* and *yang*, and the [Ifá](https://en.wikipedia.org/wiki/If%C3%A1?wprov=sfti1) system of divination developed in West Africa. Other occurrences of binary signaling can be found in the use of African slit drums for long-distance communication, the smoke signaling systems of North American indigenous peoples, and Morse code.