#BookStudies
This is a book about fundamental computer science, written by a Disney Fellow. I'm very intrigued.
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/The Pattern on The Stone - Concepts to Remember]]
### TPotS - Pg. VII - A Wizard's Computation
> I etch a pattern of geometric shapes onto a stone. To the uninitiated, the shapes look mysterious and complex, but I know that when arranged correctly, they will give the stone a special power, enabling it to respond to incantations in a language no human being has ever spoken. I will ask the stone questions in this language, and it will answer by showing me a vision: a world created by my spell, a world imagined within the pattern on the stone.
*This is the world I am trying to create for myself, the world of [[Areas/Thoughts/Worldbuilding/Fiat]]. If I can assist in accelerating the true creation process, I will be pleased with my life's work.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. VII - A Wizard's Computation]]
### TPotS - Pg. VIII - Computers Transcend Technology
> Moreover, the ideas have almost nothing to do with the electronics out of which computers are built. Present-day computers are built of transistors and wires, but they could just as well be built, according to the same principles, from valves and water pipes, or from sticks and strings. The principles are the essence of what make a computer compute. One of the most remarkable things about computers is that their essential nature transcends technology. That nature is what this book is about.
*The computing that Hillis talks about is more akin to a mystic, underlying theory of logic as the language of the universe. This plays into [[The Shell]] very nicely. I need to find a way to back-trace links; I am referring to the Shell far too often not to.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. VIII - Computers Transcend Technology]]
### TPotS - Pg. IX - The First Principle of the Nature of Computing - Functional Abstraction
> A few general themes underlie an exposition of the nature of computers: the first is the principle of functional abstraction, which leads to the aforementioned hierarchy of causes and effects. The structure of the computer is an example of the application of this principle - over and over again, at many levels. Computers are understandable because you can focus on what is happening at one level of the hierarchy without worrying about the details of what goes on at the lower levels. Functional abstraction is what decouples the ideas from the technology.
>
*This is the concept I am trying to grasp in [[Resting Emergent Systems, the Potential for AI's Future]]. I don't know if I'm fully qualified to discuss it in full, but it is definitely the concept I'm trying to hit on. We exist on layer upon layer of [[Functional Abstraction]].*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. IX - The First Principle of the Nature of Computing - Functional Abstraction]]
### TPotS - Pg. IX - The Second Principle of the Nature of Computing - The Universal Computer
> The second unifying theme is the principle of the *universal computer* - the idea that there is really only one kind of computer, or more precisely, that all kinds of computers are alike in what they can and cannot do. As near as we can tell, any computing device, whether it's built of transistors, sticks and strings, or neurons, can be simulated by a universal computer. This is a remarkable hypothesis: as I will explain, it suggests that making a computer think like a brain is just a matter of programming it correctly.
*This has everything to do with [[The Church-Turing Thesis]] that I am trying to comprehend. Once I have a better understanding of the thesis, I will probably return to this book. It also has to do with [[The Shell]] in that it demonstrates the ability for there to be an underlying "form" of a computer.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. IX - The Second Principle of the Nature of Computing - The Universal Computer]]
### TPotS - Pg. IX - The Third Principle of the Nature of Computing - Emergence
> The third theme in this book, which won't be fully addressed until the last chapter, is in some sense the antithesis of the first. There may be an entirely new way of designing and programming computers - a way not based on the standard methods of engineering. This would be exciting, because the way we normally design computers leads ultimately to a certain fragility and inefficiency. This weakness has nothing to do with any fundamental limitations of information-processing machines - it's a limitation of the hierarchical method of design. But what if instead we were to use a design process analogous to biological evolution - that is, a process in which the behaviors of the system *emerge* from the accumulation of many simple interactions, without any "top-down" control?
*I love that he establishes [[Emergence]] as a fundamental principle of computing. This has everything to do with the direction I'm trying to point my life. I want to be able to create emergent layers for myself.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. IX - The Third Principle of the Nature of Computing - Emergence]]
### TPotS - Pg. XI - The Imagination Machine
> The computer is not just an advanced calculator or camera or paintbrush; rather, it is a device that accelerates and extends our processes of thought. It is an imagination machine, which starts with the ideas we put into it and takes them farther than we ever could have taken them on our own.
*What a beautiful concept. An emergence engine.*
### TPotS - Pg. 3 - Logic to Switch Conversion
> Shannon was interested in building a machine that could play chess - and more generally in building mechanisms that imitated thought. In 1940, he published his master's thesis, which was titled "A Symbolic Analysis of Relay and Switching Circuits." In it, he showed that it was possible to build electrical circuits equivalent to an expression in Boolean algebra. In Shannon's circuits, switches that were open or closed corresponded to logical variables of Boolean algebra that were true or false. Shannon demonstrated a way of converting any expression in Boolean algebra into an arrangement of switches.
*All logic can be manifested in the physical world. Now I have to figure out how.*
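To make Shannon's conversion concrete for myself, a quick Python sketch of my own (not from the book): switches in series behave like And, switches in parallel like Or, so any Boolean expression can be laid out as a switch network.

```python
# Toy model of Shannon's insight: series switches = And, parallel = Or.
# (My own sketch, not Shannon's notation.)

def series(a, b):
    # Current flows only if both switches are closed -> And
    return a and b

def parallel(a, b):
    # Current flows if either switch is closed -> Or
    return a or b

# Example: the expression (A and B) or (not A and C) as a switch network.
def circuit(a, b, c):
    return parallel(series(a, b), series(not a, c))

# The switch network agrees with the Boolean expression on every input.
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            assert circuit(a, b, c) == ((a and b) or (not a and c))
```

The exhaustive check at the bottom is the whole point: the physical arrangement and the algebraic expression compute the same function.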
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 3 - Logic to Switch Conversion]]
### TPotS - Pg. 16 - Switches and Connectors
> Except for the miracle of reduction, there is no special reason to build computers with silicon technology. Building a computer out of any technology requires a large supply of only two kinds of elements: *switches* and *connectors*. The switch is a steering element (the hydraulic valve, or the transistor), which can combine multiple signals into a single signal. Ideally, the switch should be asymmetrical, so that the input signal affects the output signal but not vice versa, and it should have a restoring quality, so that a weak or degraded input signal will not result in a degraded output. The second element, the connector, is the wire or pipe that carries a signal between switches. This connecting element must have the ability to branch, so that a single output can feed many inputs. These are the only two elements necessary to build a computer. Later we will introduce one more element - a register, for storing information - but this can be constructed of the same steering and connecting components.
*We're all waiting for the innovation beyond silicon so that reduction can proceed. I believe light is the answer, but I can't tell you how yet. Potentially a computer made of mirrors? A computer made of the cracks between atoms.*
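A toy model of my own to pin down the two elements Hillis names: a switch that steers and restores the signal, and a connector that branches one output to many inputs. The half-strength restoring threshold is my assumption, not the book's.

```python
# Toy model of Hillis's two elements: restoring switches and
# branching connectors. Threshold value is my own assumption.

def restore(level):
    # Restoring quality: any input above half strength becomes a
    # clean, full-strength 1; anything below becomes a clean 0.
    return 1 if level >= 0.5 else 0

def inverter(level):
    # A single switch wired as an inverter: a closed (1) input
    # pulls the output low.
    return 0 if restore(level) else 1

def branch(signal, n):
    # A connector fans one output out to n inputs, undegraded.
    return [signal] * n

# A weak, degraded signal (0.7) emerges from two inverters as a clean 1.
assert inverter(inverter(0.7)) == 1
```

The asymmetry Hillis mentions shows up as the one-way function call: the input determines the output, never the reverse.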
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 16 - Switches and Connectors]]
### TPotS - Pg. 18 - Functional Abstraction
> Naming the two signals in computer logic 0 and 1 is an example of functional abstraction. It lets us manipulate information without worrying about the details of its underlying representation. Once we figure out how to accomplish a given function, we can put the mechanism inside a "black box," or a "building block," and stop thinking about it. The function embodied by the building block can be used over and over, without reference to the details of what's inside.
> This hierarchical structure of abstraction is our most powerful tool in understanding complex systems, because it lets us focus on a single aspect of a problem at a time.
> For most purposes, we can forget about technology. This is wonderful, because it means that almost everything we say about computers will be true even when transistors and silicon chips become obsolete.
*Remember this for [[Resting Emergent Systems, the Potential for AI's Future]]. Also for [[Functional Abstraction]].*
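A tiny sketch of the black-box idea (my own example, not the book's): build Xor once from And/Or/Invert, then let a half adder use it as a sealed building block without ever looking inside.

```python
# Functional abstraction in miniature: primitives, then a black box,
# then a higher layer that reuses the box. (My own example.)

def And(a, b): return a & b
def Or(a, b):  return a | b
def Invert(a): return 1 - a

def Xor(a, b):
    # Built once from the primitives...
    return Or(And(a, Invert(b)), And(Invert(a), b))

def half_adder(a, b):
    # ...then treated as a sealed building block at the next level up.
    # Returns (sum, carry).
    return Xor(a, b), And(a, b)

assert half_adder(1, 1) == (0, 1)   # 1 + 1 = 10 in binary
assert half_adder(1, 0) == (1, 0)
```

Nothing in `half_adder` cares how `Xor` works; swap in a different implementation and the layer above is untouched. That decoupling is the whole principle.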
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 18 - Functional Abstraction]]
### TPotS - Pg. 39 - The Magic of a Computer
> The magic of a computer lies in its ability to become almost anything you can imagine, as long as you can explain exactly what that is. The hitch is in explaining what you want.
*You have to know your goals before you can actualize them.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 39 - The Magic of a Computer]]
### TPotS - Pg. 40 - To paraphrase Mark Twain; Translation of the Inexpressible
> To paraphrase Mark Twain, the difference between the right program and the almost-right program is like the difference between lightning and a lightning bug - the difference is just a bug.
> A skilled programmer is like a poet who can put into words those ideas that others find inexpressible. If you are a poet, you assume a certain amount of shared knowledge and experience on the part of your reader. The knowledge and experience that the programmer and the computer have in common is the meaning of the programming language.
*This is the difference between those with control of the universe and those who live in it.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 40 - To paraphrase Mark Twain; Translation of the Inexpressible]]
### TPotS - Pg. 58 - Welcome to the Hierarchy of Functional Abstractions
> We are now in a position to summarize how a computer works, but *remember that it is not important to remember how every step works!* The important thing to remember is the hierarchy of functional abstractions.
> The work performed by the computer is specified by a *program*, which is written in a *programming language*. This language is converted to sequences of *machine-language* instructions by *interpreters* or *compilers*, via a predefined set of subroutines called the *operating system*. The instructions, which are stored in the *memory* of the computer, define the operations to be performed on data, which are also stored in the computer's memory. A *finite-state machine* fetches and executes these instructions. The instructions as well as the data are represented by patterns of *bits*. Both the finite-state machine and the memory are built of storage *registers* and *Boolean logic blocks*, and the latter are based on simple *logical functions,* such as *And, Or, and Invert.* These logical functions are implemented by *switches*, which are set up either *in series* or *in parallel*, and these switches control a physical substance, such as water or electricity, which is used to send one of two possible signals from one switch to another: 1 or 0. This is the hierarchy of abstraction that makes computers work.
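A toy version of the fetch-execute layer of this hierarchy, to make it concrete for myself; the instruction set and encoding are invented for the sketch, not taken from the book.

```python
# A minimal finite-state machine: fetch an instruction from memory,
# execute it on data also held in memory, repeat until HALT.
# (Instruction set is my own invention for the sketch.)

def run(memory):
    acc, pc = 0, 0                       # accumulator, program counter
    while True:
        op, arg = memory[pc]             # fetch
        pc += 1
        if op == "LOAD":                 # execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program: memory[12] = memory[10] + memory[11]
mem = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12),
       3: ("HALT", 0), 10: 2, 11: 3, 12: 0}
assert run(mem)[12] == 5
```

Instructions and data live in the same memory, exactly as the quote describes; everything below this layer (registers, logic blocks, switches) is hidden inside the Python primitives.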
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 58 - Welcome to the Hierarchy of Functional Abstractions]]
### TPotS - Pg. 61 - The Universal Computer
> The central idea in the theory of computation is that of a *universal computer* - that is, a computer powerful enough to simulate any other computing device. The general-purpose computer described in the preceding chapters is an example of a universal computer; in fact, most computers we encounter in everyday life are universal computers.
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 61 - The Universal Computer]]
### TPotS - Pg. 69 - Church-Turing Thesis
> The class of problems that are computable by a digital computer apparently includes every problem that is computable by any kind of device.
[[The Church-Turing Thesis]]
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 69 - Church-Turing Thesis]]
### TPotS - Pg. 70 - Goedel's Theorem
> Goedel's theorem states that within any self-contained mathematical system powerful enough to express arithmetic, there exist statements that can be proved neither true nor false.
*Interesting how this compares to the current state of AI and its inability to do math.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 70 - Goedel's Theorem]]
### TPotS - Pg. 73 - Spooky Action At a Distance
> Recently, there have been some intriguing hints that we may be able to build a quantum computer that takes advantage of a phenomenon known as *entanglement*. In a quantum mechanical system, when two particles interact, their fates can become linked in a way utterly unlike anything we see in the classical physical world: when we measure some characteristic of one of them, it affects what we measure in the other, even if the particles are physically separated. Einstein called this effect, which involves no time delay, "spooky action at a distance," and he was famously unhappy with the notion that the world could work that way.
*This is just sympathy with extra steps. Just kidding.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 73 - Spooky Action At a Distance]]
### TPotS - Pg. 76 - Kinship with a Turing Machine
> To me, life and thought are both made all the more wonderful by the realization that they emerge from simple, understandable parts. I do not feel diminished by my kinship to Turing's machine.
*Just a solid quote.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 76 - Kinship with a Turing Machine]]
### TPotS - Pg. 78 - Origin of the Algorithm and Magic Spells
> With or without socks, an *algorithm* is a fail-safe procedure, guaranteed to achieve a specific goal. The word "algorithm" comes from the name of the Arabian mathematician al-Khwarizmi, who wrote down an extensive collection of algorithms in the ninth century. The word "algebra," in fact, comes from *al jabr* ("the transposition"), a term in the title of one of his books. Many of [his] algorithms are still used today. He described them, of course, in Arabic, which may be why this language gained a reputation as the language of magic spells.
*Surprised I didn't learn this in [[Resources/Readwise/Books/Once Upon An Algorithm - Martin Erwing]].*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 78 - Origin of the Algorithm and Magic Spells]]
### TPotS - Pg. 81 - MergeSort
> There is an even more elegant recursive algorithm, which doesn't require the cards to be sequentially numbered; it would be useful for putting a large number of business cards into alphabetical order, for example. This algorithm, called *merge sort*, is harder to understand, but it's so beautiful that I cannot resist describing it. The merge-sort algorithm depends on the fact that it's easy to merge two already sorted stacks; this merge procedure is a subroutine of the algorithm, and the algorithm works like this: If your stack consists of only one card, then that card is already sorted. Otherwise, divide the stack in half, and recursively use the merge-sort algorithm by sorting each half and then combining them using the merge procedure as described above. That's all there is to it.
*[[Areas/Reading/Books_Bk/Books/Once Upon An Algorithm/Pg. 114 - Mergesort as an Optimal Algorithm]] - Once Upon an Algorithm*
*Glad to get more understanding of this concept, since it seems to be agreed upon that it truly is optimal.*
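Hillis's description translates almost line for line into Python (my transcription, not his code): a one-card stack is already sorted; otherwise split, recursively sort each half, and merge.

```python
# Merge sort exactly as Hillis describes it: the merge of two
# already-sorted stacks is the easy subroutine at the core.

def merge(left, right):
    # Repeatedly take whichever top card is smaller.
    out = []
    while left and right:
        out.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return out + left + right

def merge_sort(stack):
    # A stack of one (or zero) cards is already sorted.
    if len(stack) <= 1:
        return stack
    # Otherwise: split, recursively sort each half, merge.
    mid = len(stack) // 2
    return merge(merge_sort(stack[:mid]), merge_sort(stack[mid:]))

assert merge_sort([5, 2, 9, 1, 5, 6]) == [1, 2, 5, 5, 6, 9]
```

That's all there is to it, just as he says; the elegance is that the hard work hides inside the recursion.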
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 81 - MergeSort]]
### TPotS - Pg. 83 - Understanding Heuristics
> An algorithm, by definition, is guaranteed to get the job accomplished, but this guarantee of success often comes at too high a price. In many cases, it is more practical to use a procedure that only *almost always* gets the right answer. Often, "almost always" is good enough. A rule that tends to give the right answer, but is not guaranteed to, is called a heuristic. It is often more practical to use a heuristic than an algorithm.
*This makes me think of NASA only using n digits of pi to calculate. Let's come back to this when we transcribe [[The Big Bang of Numbers - Suri]].*
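A concrete heuristic-vs-algorithm contrast (my example, not the book's): greedy coin change almost always yields the fewest coins, but with some coin sets it is not guaranteed to.

```python
# A heuristic: always take the largest coin that fits. Fast, usually
# right, but not guaranteed minimal for every coin set. (My example.)

def greedy_change(amount, coins):
    result = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            result.append(c)
    return result if amount == 0 else None

# With coins {1, 3, 4} and amount 6, greedy gives [4, 1, 1] (3 coins),
# but the true minimum is [3, 3] (2 coins): the rule "tends to give
# the right answer, but is not guaranteed to."
assert greedy_change(6, [1, 3, 4]) == [4, 1, 1]
```

A guaranteed-minimal algorithm exists (dynamic programming over all sub-amounts), but it costs more work, which is exactly Hillis's point about the price of the guarantee.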
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 83 - Understanding Heuristics]]
### TPotS - Pg. 84 - Limitations of Algorithms
> Philosophers have written a great deal of nonsense about "the limitations of computers" when what they are really talking about are the limitations of algorithms.
*Another great quote from Hillis. Computers are only limited by Logos.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 84 - Limitations of Algorithms]]
### TPotS - Pg. 127 - A Definitive Neural Network
> A neural network is a simulated network of artificial neurons. This simulation may be performed on any kind of computer, but because the artificial neurons can operate concurrently, a parallel computer is the most natural place to execute it. Each artificial neuron has one output and a large number of inputs, perhaps hundreds or thousands. In the most common type of neural network, the signals between the neurons are binary - that is, either 1 or 0. The output of one neuron can be connected to the inputs of many others. Each input has a number associated with it, called its weight, which determines how much of an effect the input has upon the neuron's single output.
*I'd been trying to define weights and this was the best summary I had seen.*
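The neuron as described, in a few lines of Python to fix the definition in my head; the explicit threshold value is my addition for the sketch.

```python
# One artificial neuron: many binary inputs, a weight per input,
# a single binary output. (Threshold chosen by me for the example.)

def neuron(inputs, weights, threshold):
    # The weighted sum of the inputs determines the single output:
    # fire (1) when the sum reaches the threshold, else stay at 0.
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With the right weights and threshold, a single neuron computes And...
assert neuron([1, 1], [1, 1], threshold=2) == 1
assert neuron([1, 0], [1, 1], threshold=2) == 0
# ...and lowering the threshold turns the same wiring into Or.
assert neuron([1, 0], [1, 1], threshold=1) == 1
```

Each weight "determines how much of an effect the input has," exactly as the quote says; learning is just the adjustment of those numbers.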
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 127 - A Definitive Neural Network]]
### TPotS - Pg. 153 - Closing Statement
> Between the signals of our neurons and the sensations of our thoughts lies a gap so great that it may never be bridged by human understanding. So when I say that the brain is a machine, it is meant not as an insult to the mind but as an acknowledgment of the potential of a machine. I do not believe that a human mind is less than we imagine it to be, but rather that a machine can be much, much more.
*A fantastic way to close out a wonderful book.*
[[Areas/Reading/Books/Computer Science/The Pattern On The Stone - Daniel Hillis/TPotS - Pg. 153 - Closing Statement]]