computer history
Dec. 28th, 2022 11:33 am

I was asked over on mastodon whether I had any recommendations for studying either the period of computer history post-WW2 (thus "post invention of stored-program digital computers") or even pre-WW2 (thus "all the random stuff we used before then").
I do! I've put together a short list over here which, conveniently, includes a lot of links to immediately "borrow" digital copies from archive.org. Also if you follow outbound links from those, or load them on a book-recommender site, you'll probably find a lot of good related titles.
Three caveats:
- That list is very US-centric and there are big, important lists I haven't made that cover computing history as it developed elsewhere. Big chunks of it happened in Europe and Asia, and I am sadly very ignorant of those histories still.
- I haven't read all of them -- some are on an overlapping to-read list or glaring at me from the bookshelf -- but they're all well enough reviewed that I think they're worth including.
- They're very scattershot, following threads I was interested in at one time or another, not a systematic attempt to "learn subject X" completely or faithfully.
Given those caveats, I will also distil a few observations from what I have read that might help frame what I pursued, or stimulate the potential reader's curiosity:
- There is a huge institutional aspect to early computing. Governments and the military foot the bill for almost everything, and giant financial institutions (banks and insurance companies) do the rest. So it is important to understand what the institutions are interested in.
- The birthplace of a lot of it is WW2. There's no getting around it. There are vivid and lasting developments coming out of often-separate groups concerned with ballistics, intelligence and logistics, code-breaking, radar, and atomic weapons simulation. We are all basically working today with derivatives of the IAS machine -- any time someone describes a current machine as having a "Von Neumann Architecture" you can just mentally apply the footnote "atomic weapons simulation machine".
- There's a lot of tech-transfer of military technology. Post-war, a lot of the labs and R&D groups that got war funding went into business commercializing their stuff.
- There are two very distinct historical threads -- accounting uses and scientific uses -- that literally build different machines for a long time! Business machines and science machines are separate product lines. COBOL and FORTRAN. The IBM "360" project (along with PL/I) is, among other things, a merger of the two product lines into a single one. We still see echoes of this division in (say) the presence of separate binary floating point (science) and integer-and-BCD (business) instruction sets and datatypes.
- In the US there's a big east-coast / west-coast split based on trajectories of specific people and research groups. And the east coast is huge for most of the history! Route 128 is where the action is for a very long time. Stanford's postwar commercialization of microwave research on the west coast and Shockley's subsequent move to set up shop in Mountain View set in motion a long shift in the center of gravity, but even today MIT is still MIT for a reason.
- Besides the east-west split there's also a fascinating and very significant US-midwest (Minnesota specifically) trajectory to follow, around a decommissioned US naval codebreaking group called ERA, that is probably the most vivid "huge in the past, forgotten by today's youth" contrast story. This is the group that hired Seymour Cray, worked on contracts for the Office of Naval Research for a while, and bounced around ownership by some east coast companies for a bit, but eventually settled into being "Control Data" for a while before forking off again as an impatient and independent Cray Research.
- There are two very large and very long-lived companies that dominate the history, that again we tend not to think too much about in recent years: IBM and AT&T / "The Bell System". It's really hard to overstate how dominant these firms were. For much of the 20th century -- and these really are century-old companies, founded in the late 1800s! -- "computer" was really synonymous with "IBM" and "telecommunication" was synonymous with "AT&T/Bell". Nobody could come anywhere near them, everything else was just a rounding error in terms of scale. The fact that both of them lost control of their natural territory around the birth of microcomputers is part of what gives that era such a different flavor (along with, obviously, the temporary transition to a personal / home / hobbyist stance -- much of which has unravelled in the post-2010 modern Big Tech era).
- Older computers were physically huge and were constantly struggling with physical problems: heat (like today), but also power, weight, faulty parts, and assembly and maintenance costs. And they were often changing technology as new types of memory and logic gates gained the advantage: electromechanical machines, vacuum-tube machines, discrete transistorized machines. The consolidation on integrated-circuit microchips (where the whole computer is one tiny slab of silicon) took a very long time to occur.
- What we nowadays think of as "online culture" or "cyberculture" did incubate on hobbyist BBSs and usenet and such in the late 80s, but it actually has its origins in much earlier student use, in the 70s and even 60s, of institutional time-sharing systems intentionally repurposed for education -- PLATO and DTSS. It wasn't all stuffy science-and-accounting! People were sharing recipes and jokes, teaching each other crude programming languages, arguing politics, playing games and finding love online using mainframes and remote terminals decades before they were doing so with microcomputers and modems.
- As I've mentioned before, any history-reading is necessarily partial, incomplete, biased, overlapping and messy. You will only get part of any picture, and you can follow threads endlessly earlier to find more and more material, of sketchier and sketchier certainty and more and more author-imbued bias. Studying early computing can easily drag you back past 19th century mechanical tabulators and telegraph-operator culture all the way to the deep history of logic and philosophy, where people have struggled with "formalizing thinking" (for various reasons) for centuries. Embrace the depth of it! But also don't imagine you ever know it all; anything new you learn that's worthwhile will expand rather than contract the set of things you have left to learn.