I recently finished George Dyson's Turing's Cathedral after picking it up on a whim. Well, actually, 'on a whim' is not fair: I read about it on edge.org a few years ago, and also heard about it through the grapevine (or what passes for the grapevine online). It's just that there was a stack of other books I meant to read first. But what are reading lists for if not to be ignored? I saw the book in a shop in Bern, bought it, and read it.
Actions I do not regret, although the book did not capture my attention from the first chapter.
Seemingly, Dyson sets out to tell the history of the first electronic computers and the men and women who were crucial to their invention. Parts of the story - that of Turing, the great man whose name graces the title of the book - I, like any other computer scientist, have heard before. Dyson, however, focuses on America and on physical computers (at least as a framework), and there the center stage is taken by the other well-known name in computer history: John von Neumann. At his side is a group of more (and less) famous scientists and engineers working on those first machines, each getting their time in the spotlight. Most of the researchers were associated with the Institute for Advanced Study in Princeton. One name stands out, one I vaguely recognize - Julian Bigelow - but I must admit I was ignorant of the incredibly important role this engineer and computer scientist played in the history of computing. An ignorance that I fear I am not alone in, but one that I hope Turing's Cathedral is now curing, one reader at a time, once and for all.
This first part of Turing's Cathedral is somewhat slow, and it took me a while to get into the book. I had expected a more or less ordered sequence of computer history. Instead Dyson starts out in 1953 with the Norwegian scientist Nils Aall Barricelli running a first set of digital life simulations, and then leaps back to the origins of the Institute for Advanced Study and its founding. From there we go to von Neumann and the early computers. The chapters, and sometimes even the paragraphs, take leaps through time. It makes for a slow and at times confusing, yet somehow interesting, first half of the book. But around that point a change takes place. It begins with a set of chapters dedicated to various early computations: meteorology, Monte Carlo simulations, and of course those of the atomic and hydrogen bombs. The book is, by the way, very clear on how intimate the relationship was between the development of atomic bombs and the funding of computers. Dyson presents it almost as a bargain with the devil: the military got the bombs and the scientists the computers. Very interesting, and this is also where I thought I first saw the direction Dyson wanted to take with his book.
That direction is forward, into the future and the far-reaching implications of the history discussed in the first two-thirds of the book. It leads into a new realm of life created by the inventors of the electronic computers. Or so Dyson argues. One of the few non-military projects running on the early computers seems to have been Barricelli's digital life, and in the last few chapters it becomes clear why Dyson chose to start off Turing's Cathedral with that particular outlier. He is arguing that this is the origin of the Digital Universe (which also gives the book its subtitle). Not only do the last chapters add a lot to the intellectual value of the book (beyond its interesting historical contribution), they also make the somewhat convoluted path taken through the previous two-thirds of the text worthwhile. Dyson presents thoughts - his own, and those of the historical figures in the book - on how code is a form of digital life, and how we perhaps only serve as hosts, providing an environment for it to grow.
It makes for fascinating reading. Not only because it is a fresh yet historical view of the concept of the technological singularity, but because it aims to be much more. The singularity itself is sidestepped, put out of focus, because it only makes sense in a human-centered context. What Dyson is after makes us both more and less significant. Humanity built machines for computing - an act of both creation and servitude - letting information flow into a completely new realm. It makes me think of the multi-stage life cycles often found in nature.
Anyway, from a slow but historically interesting (albeit narrow, with its focus on the IAS) beginning, Turing's Cathedral grows into a very interesting book, suggesting some very radical ideas, even if the author doesn't always spell them out. Dyson is a good writer, and I think he is worth the time for anyone interested in the origins of electronic computing.