The responsibility that comes with great computational power was firmly instilled in many of us early nerds via the seminal textbook by computer science pioneers Hal Abelson, Gerald Sussman, and Julie Sussman, entitled “Structure and Interpretation of Computer Programs.” This was the textbook for MIT’s introductory course (well known for not teaching you any of the most professionally useful languages). We all kind of begrudgingly took the course because it wasn’t going to get us the jobs that we really wanted, but no one I know who took the course ever regretted it in later life. Besides teaching us about the evils of GOTO and taking us down the strange path of recursion, Abelson and Sussman and Sussman aspired to teach the ideas behind computation and what bits and bytes could represent and mean at a higher level, instead of just the applications.

One idea that stayed with me is the concept of a kind of variable that could hold not just one number, but all the numbers in the universe. So on the one hand you can say,

X = 3

which means that X has the number three inside it. You can easily change X to a different number, like 4,

X = 4

but that day in 1985, the MIT professors showed us how to put all of the numbers inside X, as a magician (or wizard) might do with their top hat,

X = <all of the numbers>

and, as if with a touch of the wand, the professor declared a new variable to hold all of the even numbers in the world as simply,

Y = 2 * X

which you can read as multiplying 2 by all the numbers in the world. And then came the subsequent elegance of declaring that to make all the odd numbers in the world, you simply add 1,

Z = Y + 1

To sit at the computer console and hold onto X, Y, and Z — each of them like a little marble holding within itself infinities of numbers — I can’t describe the feeling as anything other than magic. To hold something so LARGE—infinite, actually—in something so small felt impossible until that moment.

Related code in Scheme
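Here is a minimal sketch of the idea in the style of SICP’s streams, where the “rest” of each infinite list is a delayed computation that is only forced when you look at it. The names X, Y, and Z match the prose above; `cons-stream` and the helper procedures are my own reconstruction of the standard SICP presentation, not code from the lecture.

```scheme
;; A stream is a pair of a value and a *delayed* rest.
;; cons-stream must be a macro so the second argument
;; is not evaluated eagerly (which would loop forever).
(define-syntax cons-stream
  (syntax-rules ()
    ((_ a b) (cons a (delay b)))))

(define (stream-car s) (car s))
(define (stream-cdr s) (force (cdr s)))

;; X = <all of the numbers> (well, all the non-negative integers)
(define (integers-from n)
  (cons-stream n (integers-from (+ n 1))))
(define X (integers-from 0))

;; Apply a function to every element of a stream, lazily.
(define (stream-map f s)
  (cons-stream (f (stream-car s))
               (stream-map f (stream-cdr s))))

;; Y = 2 * X : all the even numbers
(define Y (stream-map (lambda (x) (* 2 x)) X))

;; Z = Y + 1 : all the odd numbers
(define Z (stream-map (lambda (y) (+ y 1)) Y))

;; Peek at the first n elements of an infinite stream.
(define (stream-take s n)
  (if (= n 0)
      '()
      (cons (stream-car s)
            (stream-take (stream-cdr s) (- n 1)))))

(display (stream-take Z 5)) ; the first few odd numbers: (1 3 5 7 9)
(newline)
```

Each marble really does fit in your hand: X, Y, and Z are each just one cons cell, and the rest of their infinities materializes only as `stream-take` forces it.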