graydon2: (Default)
Today I was asked -- as I am frequently asked! -- about the origins of something in Rust. In this case the origin of the "borrow" terminology. Not the idea, just the word. Who first used that word?

The short answer to this is "as far as I know, John Boyland, in his 1999 paper Alias killing: Unique variables without destructive reads" (later expanded-on in a more widely cited 2000 paper where he renamed it to the slightly less dramatic "alias burying").

John's paper was read by and influenced the Cyclone folks somewhere between 2002 and 2007. And both of these works were read by and influenced Niko Matsakis when he proposed upgrading Rust's existing and somewhat half-baked alias checker into something resembling today's region/lifetime/borrowing system (at the time this was motivated by soundness issues emerging from interactions with Rust's at-the-time half-baked mutability rules, creating the so-called "Dan's Bug").
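
For readers who haven't met the borrow checker, here is a minimal sketch -- my own illustration, not an example drawn from any of the papers mentioned above -- of the kind of mutable-aliasing hazard the modern region/lifetime/borrowing system rejects:

    fn main() {
        let mut names = vec![String::from("alias"), String::from("burying")];

        // Shared borrow: `first` points into the Vec's heap storage.
        let first = &names[0];

        // Mutable use of `names` while `first` is still live. The borrow
        // checker rejects this at compile time (error E0502), because
        // `push` may reallocate the Vec and leave `first` dangling.
        names.push(String::from("borrow"));

        println!("{first}");
    }

The sketch is deliberately broken: the point is that it is rejected at compile time rather than misbehaving at runtime.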

This was not impossible to find -- I know a few extra places to look, mind you -- but digging it up produced a couple of surprises for me:

  • I thought -- and nearly answered the person asking me -- that one of us in the Rust project had coined the term "borrow" as an ergonomic adaptation, because "region" and "alias" are clunky and nonintuitive. This was a retcon on my part! Easy to imagine, but wrong.

  • Even more interesting to me is how many other works there are in the Related Works section of John's paper! Not just things I've seen before[1] (eg. Cyclone, Tofte & Talpin's ML Kit, Ada, Euclid[2]) but also a spectrum of delightfully weird others: Eiffel*, Islands, Balloons, FAP, ESC with its "virgin, pivot and plenary" references, etc. This reference-set pulls at threads happening in multiple teams/projects/institutions, all through the 90s, 80s, even back into the 1970s, eg. here is Reynolds grappling with it in 1978.


Now, I don't want to minimize Niko's work here at all -- he's responsible for doing the lion's share of detailed design and implementation work on Rust's borrowing system in particular, along with many many other aspects of the language -- but I do think this is an illustrative example of something we should all keep in mind: little-if-anything intellectual happens in a vacuum, few-if-any ideas spring forth fully formed. And we're all curiously willing to retcon the origins of such things. Even things we ourselves participated in!

(I am reminded of similar -- much weightier -- arguments that I have read about whether Darwin invented the concept of Evolution, or Einstein of Relativity, or Planck of Quantum Mechanics. On close inspection the person who gets cited is always part of a lengthy and ongoing research program spanning decades, never just "inventing stuff in isolation". But we are all very inclined to imagine it as so!)




[1]: I have elsewhere written about precedents-I-knew-about when working on Rust on my own, and designing the original somewhat half-baked move-semantics and mutable-alias-control systems:

[2]: A genuinely odd coincidence is that the earliest reference I can find to anyone grappling with mutable aliasing control in a PL design -- not just a complaint about how it breaks Hoare[3] logic -- is Euclid, which was a multi-site / multi-collaborator project including the one and only Butler Lampson, but was fairly substantially focused in Toronto (Horning, Holt, Cordy) and published in 1977. I was born in 1977 and grew up in Toronto, so somehow this all seems like, I don't know, something that was in the air?


[3]: No relation. I know, right? How can this be?

PERQ

Sep. 7th, 2024 01:12 am
graydon2: (Default)
A note on the PERQ computer.

technical history muttering )
graydon2: (Default)
I spent another couple of days working with Talon (and its many extensions) for voice-coding and found a good set of additional extensions and modes: a multiple-named-clipboards mode, a dense grid mouse navigation mode, a window-moving mode, and several sub-modes of the standard community library for multiple cursors, find-and-replace, page and tab nav, named abbreviations and websites, and so forth.

I've collected a cheatsheet here if anyone's curious.

As it happens I also spent a few days going carefully through different muscle groups along my arms and back and found a gigantic knot under my right shoulder which I think accounts for most of the wrist trouble. I spent a bunch of time with a lacrosse ball (hint: these are the best muscle-release massage tools in the entire world, and they are also very cheap and convenient) and it seems to be mostly fixed now? So my actual motivation to keep working with Talon has gone from relatively-pressing to moderate-residual-curiosity and I'm mostly just typing again now. I'll see if this keeps up or if I go back to it!
graydon2: (Default)
My wrist has been hurting a little bit recently, and a friend of mine has been telling me about how he's using Talon (and its extensions Cursorless and Rango) as an interface to programming entirely with his voice, so I thought I'd give it a try. I'm now about three days into using it, and I have it working well enough that I just composed my first real patch, one that makes a not-completely-trivial change to a program. It took me almost an entire workday to complete the 165-line patch, but I can definitely feel myself accelerating and gaining fluency in the way the system works.

(This blog post is also being mostly composed using Talon.)

notes )
graydon2: (Default)
Recently Boats wrote a blog post about Rust, mutable aliasing, and the sad story of local reasoning over many decades of computer science. I recommend that post and agree with its main points! Go read it! But I also thought I'd add a little more detail to an area it's less acutely focused on: formal methods / formal verification.

TL;DR: support for local reasoning is a big factor in the ability to do automated reasoning about programs. Formal verification involves such reasoning. Rust supports it better than many other imperative systems languages -- even some impure functional ones! -- and formal verification people are excited and building tools presently. This is not purely by accident, and is worth understanding as part of what makes Rust valuable beyond "it doesn't need a GC".
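
To make the "local reasoning" point a bit more concrete, here is a tiny sketch of my own -- not taken from Boats' post or from any particular verification tool -- showing why exclusive &mut references help: a fact about a function can be justified by reading that function alone, with no whole-program alias analysis.

    // `counter` and `reading` cannot alias, because `&mut` is exclusive.
    // The assertion below is therefore justified by this function alone.
    fn bump(counter: &mut i32, reading: &i32) -> i32 {
        let before = *reading;
        *counter += 1; // cannot possibly change *reading
        assert_eq!(before, *reading);
        *reading
    }

    fn main() {
        let mut c = 0;
        let r = 7;
        println!("{}", bump(&mut c, &r));
        // The call that would break the reasoning -- bump(&mut c, &c) --
        // is rejected at compile time, which is what lets a verifier
        // trust the purely local argument.
    }

In C, the analogous function would need an unchecked restrict annotation to license the same reasoning; here the compiler enforces it.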

The rest of this post is just unpacking and giving details of one or more of the above assertions, which I'll try to address in order of plausible interestingness to the present reader, but I will also throw in some history because I kinda can't help myself.

long post for the PL nerds in the audience )

reading

Jan. 14th, 2024 01:31 pm
graydon2: (Default)
I'm hoping to do a little more reading in 2024 than I managed in 2023 (at least in part due to work calming down a bit). This week I finished off two books from last year.
blah blah )
graydon2: (Default)
In a recent podcast about Rust leadership, the BDFL question came up again and Jeremy Soller said (in the understatement of the century) that "I believe Graydon would have said no to some things we all like now". And this echoes a different conversation on reddit where I was reminded that I meant to write down at some point how "I would have done it all differently" (and that this would probably have been extremely unsatisfying to everyone involved, and it never would have gone anywhere).

Boy Howdy would I ever. This is maybe not clear enough, and it might make the question of whether the project "really should have had a BDFL" a little sharper to know this: the Rust We Got is many, many miles away from The Rust I Wanted. I mean, don't get me wrong: I like the result. It's great. I'm thrilled to have a viable C++ alternative, especially one people are starting to consider a norm, a reasonable choice for day-to-day use. I use it and am very happy to use it in preference to C++. But!

There are so, so many differences from what I would have done, if I'd been "in charge" the whole time.

differences! )
graydon2: (Default)
Over on the socials, someone asked "Do you ever wish you had made yourself BDFL of the Rust project? Might there be less drama now if the project had been set up that way?"

This is a tricky question, not because of what it's superficially asking -- the answer there is both "no" and "no" -- but because of what I think it's indirectly asking. I think it's asking: is there some sort of original sin or fatal flaw in the Rust Project's governance -- perhaps its lack of a BDFL -- that's to blame for its frequent internal sociopolitical crises?

To that I can speak, hopefully briefly, or at least concretely. I think there are a few underlying causes, and one big pattern.
governance )
graydon2: (Default)
A short note about corporate free / open source software (FOSS), and corporate-employed maintainers. Or specifically "corporate-employed maintainers .. with bad incentives". I tried to come up with a pithy name, but that's the best I could do: CEMBIs.

I've worked professionally in FOSS for a long time. I've seen a lot of corporate approaches to FOSS participation over that time, and seen the FOSS community develop a fairly nuanced understanding of what sorts of corporate strategies are welcome or unwelcome, healthy or harmful to a project. We have articulated a lot of thoughts about (say) open core and dual licensing business models, or (say) which forms of corporate "embrace" represent a step on an "extend/extinguish" path, or (say) which forms of telemetry are appropriate and which put users at risk of surveillance.

What I haven't seen a lot of discussion of, and wish I did see, is the structure and content of relationships that exist between corporate-employed FOSS maintainers and their employers. And I think this matters because the people doing corporate FOSS aren't soul-less automata executing corporate strategy. They are people with their own motivations, incentives, a certain amount of autonomy, but (most relevant to my concerns) a set of performance-evaluation criteria they have to satisfy to remain employed and/or get promoted.

Companies incentivize lots of things, but I worry (and in many cases I either know first hand or have heard second hand) that companies often have an incentive structure that rewards novelty, especially in the form of features if not entire products. There's a good reason for this when the company is "making products to sell": this year's new-and-improved is always sellable over last year's tired old model. But it's also just a reflection of the "growth orientation" of capitalism in general: whatever activity the company accomplished last year, a common measure of health and vitality isn't consistent execution but growth of the business. Failure to grow is treated as stagnation, which is treated as equivalent to death. This orientation can be embedded so deep in a company's DNA that it's the incentive structure given to everyone who works there, regardless of whether they're making products, auditing the accounts, or maintaining corporate infrastructure.

And to a large measure, that's what FOSS is. Not always, but usually: infrastructure. It's stuff that's supposed to work the same way from one day to the next. Stuff that's not supposed to be noticed because it just works. Reliably, efficiently, silently. Stuff that has a massive installed base of users relying on it, massive social and institutional inertia, and thereby massive (and sensible) built-in resistance to novelty. You don't actually want novelty in the electrical grid, the drinking water system, sewers, roads, bridges, rail lines, telecoms .. you want this stuff to be absolutely rock solid and not novel in the least. That's what maybe 90 or 95% of FOSS is like, certainly the stuff that needs reliable corporate maintainers.

For corporate-employed FOSS maintainers working at a firm with these "growth and novelty" incentives -- CEMBIs -- this leads to a real quandary. They're in a position where their job performance is very likely to be evaluated in terms of visible growth and novelty (it might be dressed up in more abstract terms like "real-world impact" or "visibility" but it still means the same thing) even though that is exactly the wrong thing for the health of the project they're maintaining. The incentives are bad. If they do the best thing for the project as infrastructure -- triage and fix bugs from the backlog, optimize performance, increase security and reliability, pay down tech debt, simplify and automate ongoing maintenance -- the bias of their organization is likely to view them as "doing nothing", or at best doing "low-value work" that only counts as "reducing fixed costs", not leading the way towards new growth. To be seen in a positive light by their employer, the CEMBI winds up having to do essentially anti-maintenance work: make the program "do something new and exciting" that it didn't do yesterday. Ignore maintenance of "what is", focus on "what's next".

Seeing this over and over in FOSS makes me a bit sad. I guess I've maybe been watching too many re-runs of The West Wing lately, but while I've been thinking about this subject, I keep thinking of the "Let Bartlet Be Bartlet" episode near the end of the first season, where the characters come around to the idea that maybe it'd be best to just do what they were elected to do. I keep thinking it'd be nice if employers could incentivize their employees to do what they were hired to do. To "Let Maintainers Be Maintainers". Maintaining FOSS isn't low-value work. It's essential work, stuff that you ignore at your medium and long-term peril. As a new homeowner I will make more salient analogies: it's testing the wiring for faults, testing the walls for mould, replacing the roof before it leaks. It's work that has to be done in order for FOSS to keep functioning over the long term. Software actually does "break down and wear out": requirements change; upstream and downstream platforms and libraries change incompatibly; patterns of usage change; hardware changes; new subsystems become performance bottlenecks; new bugs are discovered, some of which will be high severity security vulnerabilities, others will merely grow in importance as they're encountered more often. Software maintenance is real and important work, and if a company hires a maintainer of a project "in order to support the project", it's what that company should be incentivizing and evaluating those maintainers in terms of.
graydon2: (Default)
On a particularly stressful day this holiday I decided to distract myself by looking into the state of amateur radio (and adjacent issues). I don't remember what specifically set it off, but I fell into the rabbit hole for the next 48h or so and opened a million tabs and read a bunch and learned enough that I figured I'd make some notes here for future reference to myself / picking up if I'm ever interested again.

Reader beware: amateur radio is the natural home of some of the deepest turbo-nerds on earth, and it gets .. a bit heavy.

buckle up )
graydon2: (Default)
I was asked over on mastodon whether I had any recommendations for studying either the period of computer history post-WW2 (thus "post invention of stored-program digital computers") or even pre-WW2 (thus "all the random stuff we used before then").

I do! I've put together a short list over here which, conveniently, includes a lot of links to immediately "borrow" digital copies from archive.org. Also if you follow outbound links from those, or load them on a book-recommender site, you'll probably find a lot of good related titles.

Three caveats:

  1. That list is very US-centric and there are big, important lists I haven't made that cover computing history as it developed elsewhere. Big chunks of it happened in Europe and Asia, and I am sadly very ignorant of those histories still.

  2. I haven't read all of them -- some are on an overlapping to-read list or glaring at me from the bookshelf -- but they're all well enough reviewed that I think they're worth including.

  3. They're very scattershot, following threads I was interested in at one time or another, not a systematic attempt to "learn subject X" completely or faithfully.


Given those caveats, I will also distil a few observations from what I have read, that might help frame what I pursued or stimulate the potential reader's curiosity:

  • There is a huge institutional aspect to early computing. Governments and the military foot the bill for almost everything, and giant financial institutions (banks and insurance companies) do the rest. So it is important to understand what the institutions are interested in.

  • The birthplace of a lot of it is WW2. There's no getting around it. There are vivid and lasting developments coming out of often-separate groups concerned with ballistics, intelligence and logistics, code-breaking, radar, and atomic weapons simulation. We are all basically working today with derivatives of the IAS machine -- any time someone describes a current machine as having a "Von Neumann Architecture" you can just mentally apply the footnote "atomic weapons simulation machine".

  • There's a lot of tech-transfer of military stuff. Post-war, a lot of labs and R&D groups that got war funding went into business commercializing their stuff.

  • There are two very distinct historical threads -- accounting uses and scientific uses -- that literally build different machines for a long time! Business machines and science machines are separate product lines. COBOL and FORTRAN. The IBM "360" project (along with PL/I) is, among other things, a merger of the two product lines into a single one. We still see echoes of this division in (say) the presence of separate binary floating point (science) and integer-and-BCD (business) instruction sets and datatypes.

  • In the US there's a big east-coast / west-coast split based on trajectories of specific people and research groups. And the east coast is huge for most of the history! Route 128 is where the action is at for a very long time. Stanford's postwar commercialization of microwave research on the west coast and Shockley's subsequent move to set up shop in Mountain View set in motion a long shift in the center of gravity, but even today MIT is still MIT for a reason.

  • Besides the east-west split there's also a fascinating and very significant US-midwest (Minnesota specifically) trajectory to follow around a decommissioned US naval codebreaking group called ERA that is probably the most vivid "huge in the past, forgotten by today's youth" contrast story. This is the group that hired Seymour Cray, worked on contracts for the office of naval research for a while, bounced around ownership by some east coast companies for a bit but eventually settled into being "Control Data" for a while before forking off again as an impatient and independent Cray Research.

  • There are two very large and very long-lived companies that dominate the history, that again we tend not to think too much about in recent years: IBM and AT&T / "The Bell System". It's really hard to overstate how dominant these firms were. For much of the 20th century -- and these really are century-old companies, founded in the late 1800s! -- "computer" was really synonymous with "IBM" and "telecommunication" was synonymous with "AT&T/Bell". Nobody could come anywhere near them, everything else was just a rounding error in terms of scale. The fact that both of them lost control of their natural territory around the birth of microcomputers is part of what gives that era such a different flavor (along with, obviously, the temporary transition to a personal / home / hobbyist stance -- much of which has unravelled in the post-2010 modern Big Tech era)

  • Older computers were physically huge and were constantly struggling with physical problems: heat like today, but also power, weight, faulty parts, assembly and maintenance costs. And they were often changing technology as new types of memory and logic gates gained the advantage: electromechanical machines, vacuum-tube machines, discrete transistorized machines. The consolidation on integrated-circuit microchips (where the whole computer is one tiny slab of silicon) took a very long time to occur.

  • What we nowadays think of as "online culture" or "cyberculture" did incubate on hobbyist BBSs and usenet and such in the late-80s but it actually has its origins in much earlier student use in the 70s and even 60s of institutional time-sharing systems intentionally repurposed for education -- PLATO and DTSS. It wasn't all stuffy science-and-accounting! People were sharing recipes and jokes, teaching each other crude programming languages, arguing politics, playing games and finding love online using mainframes and remote terminals decades before they were doing so with microcomputers and modems.

  • As I've mentioned before, any history-reading is necessarily partial, incomplete, biased, overlapping and messy. You will only get part of any picture, and you can follow endlessly-earlier to find more and more material, of sketchier and sketchier certainty and more and more author-imbued bias. Studying early computing can easily drag you back past 19th century mechanical tabulators and telegraph-operator culture all the way to the deep history of logic and philosophy, where people have struggled with "formalizing thinking" (for various reasons) for centuries. Embrace the depth of it! But also don't imagine you ever know it all; anything new you learn that's worthwhile will expand rather than contract the set of things you have left to learn.
graydon2: (Default)

But to see mechanization and automation purely as a problem in comparative cost is greatly to minimize their role -- and to pay further for the error of confining economic goals, and economic calculation, to profit maximization. The technostructure, as noted, seeks technical progressiveness for its own sake when this is not in conflict with other goals. More important, it seeks certainty in the supply and price of all the prime requisites for production. Labor is a prime requisite. And a large blue-collar labor force, especially if subject to the external authority of a union, introduces an element of uncertainty and danger. Its cost is not under the control of the technostructure, although in the planning system there is, of course, the power to offset labor cost changes with price changes. There remains the risk and consequences of a strike.

In contrast, mechanization adds to certainty. Machines do not yet go on strike. The prices are subject to the considerable certainty which, we have seen, is inherent in the contractual relationships between large firms. The capital by which the machinery is provided comes in large proportion from the internal savings of the firm. More white-collar workers and more members of the technostructure will be required with mechanization. But white-collar workers tend to identify themselves with the goals of the technostructure with which they are fused. Such is the result of replacing twenty blue-collar workers with two men or women knowledgable in computers.

Juneteenth

Jun. 19th, 2020 06:24 pm
graydon2: (Default)
[CW: slavery, white supremacy, violence]

I had a long-form rambling blog post here about Juneteenth that I ultimately didn't like the tone or arrangement of. But I did like some of its content, so I'll try to reproduce that here in a significantly shorter, hopefully more-readable point-form post.

some reflections )
graydon2: (Default)

If children survived to age seven, their recognized life began, more or less as miniature adults. Childhood was already over. The childishness noticeable in medieval behaviour, with its marked inability to restrain any kind of impulse, may have been simply due to the fact that so large a proportion of active society was actually very young in years. About half the population, it has been estimated, was under twenty-one, and about one third under fourteen.

Barbara Tuchman -- A Distant Mirror: The Calamitous 14th Century

graydon2: (Default)
Observations noted in From Krivine’s machine to the Caml implementations (Leroy, 2005):

In every area where abstract machines help, there are arguably better alternatives:

• Exposing reduction order: CPS transformation, structured operational semantics, reduction contexts.
• Closures and environments: explicit substitutions.
• Efficient execution: optimizing compilation to real machine code.
• Intermediate languages: RTL, SSA, CPS, A-normal forms, etc.
• Code distribution: ANDF (?).

Abstract machines do many things well but none very well.

This we will illustrate in the case of efficient execution of Caml.

[...]

In retrospect, the journey could have been shorter by not going through abstract machines at all.

But maybe it would never have taken place without the "detour" through abstract machines.

Observations noted in The Verified CakeML Compiler Backend (extended version, Tan et al., 2018):

The previous compiler compiled from source to a single IL, then to stack-machine-based bytecode and finally to x86-64. The bytecode was designed so that each operation mapped to a fixed sequence of x86 instructions, and it was also designed to make verification of the GC as easy as possible. Unfortunately, the ease of verification also meant that the compiler had poor performance – we found the bytecode IL too low level for functional programming optimisations (multi-argument functions, lambda lifting, etc.) and too high level for backend optimisations. For example, it naively followed the semantics and allocated a closure on each additional argument to a function, pattern matches were not compiled efficiently (even for exhaustive, non-nested patterns), and the bytecode compiler only used registers as temporary storage within single bytecode instructions. The new version addresses all of these problems and splits each improvement into its own phase and IL in order to keep the verification of different parts as separate and as understandable as possible.

(emphases mine)
graydon2: (Default)

Rust 2019 and beyond: limits to (some) growth.


This is a blog post (as solicited) about my suggestions for the Rust project in 2019 and beyond.
observations and advice )
graydon2: (Default)
God I love this book!


Until the end of World War II or shortly thereafter, planning was a moderately evocative word in the United States. It implied a sensible concern for what might happen in the future and a disposition, by forehanded action, to forestall avoidable disfunction or misfortune. As persons won credit for competent planning of their lives, so communities won credit for effective planning of their environment. It was thought good to live in a well-planned city. The United States government had before the war a National Resources Planning Board. During the war, postwar planning acquired the status of a modest industry in both the United States and the United Kingdom; nothing else, it was felt, would so reassure those who were fighting that they had eventual utility as civilians.

In the Cold War years, however, the word planning acquired grave ideological overtones. The Communist countries not only socialized property, which seemed not a strong likelihood in the United States, but they planned, which seemed more of a danger. Since liberty was there circumscribed, it followed that planning was something that the libertarian society should avoid. Modern liberalism carefully emphasizes tact rather than clarity of speech. Accordingly, it avoided the term, and conservatives made it one of opprobrium. For a public official to be called an economic planner was less serious than to be charged with Communism or imaginative sexual proclivity but it reflected adversely nonetheless. One accepted and cherished whatever eventuated from the untrammeled operation of the market. Not only concern for liberty but a reputation for economic hardihood counseled such a course.

For understanding the economy and polity of the United States and other advanced industrial countries, this reaction against the word planning could hardly have been worse timed. It occurred when the increased use of technology and the accompanying commitment of time and capital were forcing extensive planning on all industrial communities -- by firms and of firms' behavior by government. The ban on the use of the word planning excluded reflection on the reality of the planning.

This ban is now in the process of being lifted -- much has been accomplished in this regard in the eleven years since the first edition of this book appeared. The need for national planning has become a reputable topic for discussion, as also legislation to facilitate it. On a matter such as energy the need is accepted but in circles of the highest repute the term czar is still preferred to that of planner, though not, one judges, because it is deemed more democratic.

However, it is still the instinct of conservatives and those for whom high banking or corporate position serves as a substitute for thought that anything called planning should be resisted. And perhaps there are useful elements of self-interest in the effort. Any discussion of planning by the government will draw attention, inevitably, to the planning by corporations that makes it necessary. Those who now, in the manner of all planners, guide or control the behavior of individuals will no longer be able, on grounds of high principle, to resist public guidance, control or coordination of their planning.


graydon2: (Default)

I am also concerned to show how, in this larger context of change, the forces inducing human effort have changed. This assaults the most majestic of all economic assumptions, namely that man, in his economic activities, is subject to the authority of the market. Instead we have an economic system which, whatever its formal ideological billing, is, in substantial part, a planned economy. The initiative in deciding what is to be produced comes not from the sovereign consumer who, through the market, issues the instructions that bend the productive mechanism to his ultimate will. Rather it comes from the great producing organization which reaches forward to control the markets that it is presumed to serve and, beyond, to bend the customer to its needs. And, in so doing it deeply influences the values and beliefs -- including not a few that will be mobilized in resistance to the present argument. One of the conclusions that follows from this analysis is that there is a broad convergence between industrial systems. The imperatives of technology and organization, not the images of ideology, are what determine the shape of economic society. This, on the whole, is fortunate, although it will not necessarily be welcomed by those whose intellectual capital and moral fervor are invested in the present image of the market economy as the antithesis of social planning. Nor will it be welcomed by their disciples, who, with even smaller intellectual investment, carry the banners of free markets and free enterprise therewith, by definition, of the free nations into political, diplomatic, or military battle. Nor will it be welcomed by those who identify planning exclusively with socialism.

[...]

Accordingly, in the later chapters I turn to the effect of economic change on social and political behaviour, and to remedy and reform. As noted, I am led to the conclusion, which I trust others will find persuasive, that we are becoming the servants in thought, as in action, of the machine we have created to serve us. This is, in many ways, a comfortable servitude; some will look with wonder, and perhaps even indignation, on anyone who proposes escape. Some people are never content. I am concerned to suggest the general lines of emancipation. Otherwise we will allow economic goals to have an undue monopoly of our lives and at the expense of other and more valuable interests. What counts is not the quantity of our goods but the quality of life.

John Kenneth Galbraith -- The New Industrial State

graydon2: (Default)
It's Nov 12 (which is the skip-a-day dislocation of Nov 11 for weekday-holiday reasons), and it's 2018. Which means it's 100 years after the Armistice was signed ending the Great War, a.k.a. WWI.

This time of year I usually post something a bit scold-y about our failure to remember sensibly: about militarism and nationalism, jingoism, the distortion and glorification of a day that ought to mark our greatest shame and horror. Here and here are some representative scoldings. Or here in a more sarcastic form.

This year I thought I'd be more constructive, and relate the story of my own relationship to changing opinions on this matter. Because I would like to encourage anyone reading (especially younger people!) to take some time to do the same. It takes time and effort, but it's well worth it.

When I was in my mid 20s, I really had very little idea about the past, at least not much beyond a couple highschool history classes in which old-timey things were placed in some order and said to have happened in the before-color-TV era of the 1900s, or 1800s, or .. well gosh I couldn't really put my finger on which things and when. I read a bit of politics, but it was mostly of the radical and idealistic kind, and I was leaning vaguely towards (embarrassingly) neoliberal free-market beliefs about economics due to exposure to The Economist as a magazine that at least tried to report some kinda-global news every week, in between their editorials about "flexible labor" and "sclerotic regulation" and so forth. I had only a weak sense of how economic policy interacted with power, institutions, politics, human or civil rights, social justice, etc. etc.

At some point I was on vacation in Europe (by embarrassing coincidence, reading Neal Stephenson's Baroque Cycle books) and I was visiting places with quite a bit of history on display, and I was increasingly realizing that (a) there was a lot more time with a lot more meaningful structure to the goings-on between (say) 1600 and 2000 than I had formerly been willing to acknowledge and (b) I was terribly under-informed about a lot of it. So I picked up a book that tried to do some History For Real (randomly: Roberts' History of the 20th Century) and devoured it.

This book astonished me. The 20th century was so full of absolutely bizarre, incomprehensible carnage, massive migrations of people, disintegrations and creations of empires, redefinitions of the global order .. I could scarcely believe how different the world of 1900 was from the world of 2000. It takes serious mental work to get a grip on, and everything I learned opened up a hundred new questions ("so X was Y at this time .. ok .. but how on earth did it get that way?")

So when I returned home I realized this was going to be like my main reading topic for the next while. And that while turned out to include a pretty focused ten-ish years (most of the duration of the Rust project, oddly) and a less-focused but ongoing interest through the present. Each book and each set of questions led to another, and I tried to put together a passable understanding of larger pieces of the historical puzzle. Hobsbawm, Braudel, Himmelfarb, Fisk, Harvey, Thomas, Taylor, Said, Polanyi, Bayly, Bulliet, Mumford, Tuchman, Holmes, Phillips, Fairbank, and on and on. I also got quite into listening to audiobooks, lectures (especially from The Teaching Company) and online courses (especially from UC Berkeley, though many of these have been discontinued or migrated to iTunes U). By now I'm perhaps at the point where I have a rough picture of a few hundred years of a few dozen major polities; and at least a smattering of the major signposts in the millennium or two preceding, though it becomes increasingly fragmented and biased as it goes back.

Reading history is a never-ending process, but even the first few years of engagement in the matter really sharpened my politics. The details make matters of policy that were previously vague or confused seem clear and vivid. It is one thing to have an abstract sense of the tension between (say) labor and capital, but quite another to have a timeline laid out of a century and a half of actual warfare over it, its reorganization of continents and empires. It's one thing to think vague thoughts about European imperialists overpowering colonized peoples sometime in the past, but quite another to go through the events country by country, massacre by massacre, famine by famine, dislocation by dislocation. It's one thing to know there was an Atlantic slave trade and that it was a terrible injustice, but quite another to read of the centuries of traffic, the thousands of ships, the millions of lives.

Reading history has also made me frustrated at the selectivity of memory. Especially on days like this, where so little about the circumstances and meanings of WWI is given the slightest discussion. And it does make me annoyed anytime someone tries to make a point in the present by waving their hands and saying "look at history!" as though history-as-a-whole supports them, as though there were a single obvious lesson to be learned from all of it. It has terrible depth and complexity, and one must always strive to read multiple treatments of a topic, by multiple people with multiple biases, multiple approaches to telling, emphasizing, omitting. Historiography -- the history of history-telling -- is a whole additional meta-field I have barely scratched the surface of.

But all that aside: I strongly recommend making the time for it. It's enriched my life in ways I never would have expected. Politics, sure; but also understanding and appreciating people I meet in life, places I visit, events in the news, the context of my life, my home country, my place in the systems of the world. I only wish I had taken up the habit earlier. Many people have the impression the entire topic hinges on military history (and popular writing definitely does over-produce military history) but historical writing is so much bigger and more interesting than that. Ignore the military stuff, you'll still have a vast and fascinating field to explore.

(It's especially fun to read in the modern world, where we have access to basically perfect mapping technology plus all of wikipedia. Any thread you want to tug on while you're reading, you can stop and read endless additional details on.)