When I arrived at Toyota-Volvo of Keene[1], New Hampshire, this morning, there was already a palpable buzz amongst the service staff about the man I have been scheduled to meet there. The wheels of warranty-repair justice turn slowly, but the tech that Toyota have[2] dispatched for me is the man they call when no one knows what to do, he is a fixer, he is the Wolf; I am half expecting Harvey Keitel to walk in the door.
I don’t know how you spent your teenage years, but mine involved lining car door panels with Dynamat to muffle the resonance of too much subwoofer. I never exactly grew out of that phase. When I drive near you, you’re probably going to hear it.
So when I found out last summer that my GR Corolla would have what Toyota optimistically terms the premium audio package[3], I was pleased, because at the very least that would give me six speaker mounts in front. And, when I got laid off two weeks after I paid cash for the speedy little bastard, I figured the sound quality was competent, if not inspired — enough to carry me until I found suitable employ to justify building it out.
When the tech arrives, he’s not Harvey Keitel, but he is full-on Bah-ston; he has run up from Worcester, Mass., in the rain. I shake hands, and then ask: “Did you bring one?” and on his nod run out the door without further social grace because this is my first chance to see another GR Corolla. And there it is, a few spaces down from mine.
Not-Keitel’s GR Corolla is black. Somehow it looks forlorn. I feel unmoved. Then I feel sad.
“I think it looks better in white,” he says, catching up with me, and I, long-time hater of white cars, realize that I agree. The styling beef that Toyota has glommed onto the GR to differentiate it from, well, the Corolla that it actually is, seems to disappear into the black. There is a crack in the windshield longer than my arm, two of the four wheels are missing valve-stem caps, and it’s raining.
The only other GR Corolla I have ever seen.
One day last November or so I went to drive my car and everything sounded terrible. I re-downloaded all my music in highest quality, munged settings, tried different inputs, all the boring crap you do when you’re trying to be scientific about something, hell I’m bored just writing this sentence. Nothing alleviated the portable-tape-deck sound that I was suffering out of the rear speakers.
I took it to the dealer in January. Several techs rotated into and out of my car, listening, shaking their heads, exchanging yeah, this is messed up looks. No one can put their finger on what exactly is wrong with my sound system but everyone who listens to it emerges feeling depressed. It is generally agreed that it is broken.
After this it takes months to arrange today’s rendezvous.
I have this recurring stress dream in which I am unable to figure out how to use my phone. It is an emergency, someone insistently needs medical attention, but I can’t find the app to make a phone call, I can’t seem to align my fat fingers to dial 9-1-1. This anxiety now plays out in real life in Keene, New Hampshire.
We need to reproduce my problem in a car that is not my own, ergo the tech bringing his own GR. I am unwilling to pair my phone to the not-my-GR because it took me hours of menu-diving, app downloads, network switching and restarts to get paired to my own. The tech’s phone doesn’t have signal, so he can’t stream the song I played in my own car moments before (the only music he has locally is Pearl Jam. I’m not going to touch that). We find that it may be literally impossible to tune the radio manually to get a local station — I found a buried sub-menu, but you have to maybe type in the frequency? Only the digits 3, 4 and 0 are available, the others are greyed out. This is really happening. Is this real life? — and we consider connecting my phone using a cable, but his phone is lightning and mine is USB-C, so we can’t. Easily twenty minutes go by in this fever nightmare. Finally we realize that the car has Sirius XM service and I dial in an appropriately stupid electronic-dancey station, sufficiently thumpy.
We start fading the sound to the back…and…
it sounds exactly like my car did.
The tech and I do a long, slow-burning stare at each other, his mouth open and eyes droopy at the bottom, a little like Huckleberry Hound. We don’t say anything for several seconds and I can hear the just-barely sounds of the rain over the near-nothing that is happening from the rear speakers. “Wow,” he says eventually, almost reverentially. “That sounds really terrible.”
“Maybe they’ll send you a stick you can just plug in and everything will be better,” offers the deep-voiced, white-mustached, appropriately begrimed man who seems to be in charge of the service bay; the distant look in his eyes suggests that he has seen things. He’s come in to shoot the shit with the tech because, like I said, the guy carries a lot of weight and everyone seems happy to see him. St. Patrick’s Day plans are discussed.
We’d been standing there, the three of us, in the service department, for several minutes, and the general consensus is software. Has Toyota pushed an update that neutered their own sound system? Did it somehow always sound this bad and I didn’t notice for months? There’s a gloominess, maybe it’s the weather. I have no reason to still be there but can’t seem to leave.
“Well,” offers the tech, “at least Toyota doesn’t hijack your speakers. I had a guy with a loaded Tundra last week who was pissed off at how bad his JBL system sounded. It was because his music was being drowned out by piped-through synthetic V-8 engine noises.”
Eventually I drove home because what else is there to do, never once touching the speed limit, behind dump truck, garbage truck, garbage truck, tractor trailer, erratic but ponderous Sentra, garbage truck again. Road mist and rain, the sky looks filthy, everything seems very glued to the ground and a little dream-like.
At least my engine noise is real.
Correct. Toyota-Volvo. I didn’t even know that combination was possible; it’s like pickles and milk. ↩︎
I’m leaving this subject-verb agreement as it lies, with the risk of sounding affected. I spend a lot of time around British people. ↩︎
I had no sway in the build-out configuration or color (white) of my car. I am lucky to have it at all. ↩︎
The entire point of this series was that it would allow me to get here and then talk at you about how a year ago I rebuilt my entire old site in Next.js blah blah [time passed] um dev ergonomics, strict typing then subsequent personal crisis about what static-website really means and everyone hates Next.js and React now and start over and finding closure in technical compromise with Eleventy TADA and you see my new website before you herk blah boring words.
Forget it. I can write a separate post or, hell, fifty of them explaining all that. You probably don’t care, and that’s just fine.
I realized that, no, actually, the thing is: I could say anything to everyone if I wanted to. It’s my web site, and my regrettable content.
Hello, World.
Again.
What comes next?
ME: I’ve got to update my site.
ME: ...I’ve really got to do something about my site.
ME: I need to put something new on my site.
ME: ...It is getting embarrassing.
ME: I gotta update dependencies so I can get my site to build again.
ME: I gotta re-architect a bunch of my site that I can update the dependencies to get it to build again.
ME: I gotta entirely re-implement most of my web site to make it build again with updated dependencies on a newer version of Node.js.
ME: ...A bunch of those dependencies don’t even exist anymore.
ME: ...I gotta make a new web site.
ME: Aw crap, I’m going to have to start from scratch.
MY HOSTING PROVIDER (NETLIFY): We are turning off support for your incredibly ancient version of Node.js in X weeks.
ME: (muffled noises of despair)
A WISE LITTLE BIRD (very much NOT ME): Why don’t you just curl-crawl your entire site and literally serve it as static HTML pages?
And lo, for the last several months of 2023, Lyza.com was basically the result of a Save As... operation: just HTML and CSS.
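For the curious, the little bird’s advice translates to something like the following sketch. I never recorded the exact incantation, and I’m using wget here rather than curl proper, since wget does recursive mirroring out of the box; the flags shown are my reconstruction, not the command actually run:

```shell
# Mirror the live site into a local directory as plain HTML/CSS.
# --mirror           : recursive retrieval with timestamping
# --page-requisites  : also fetch the CSS, images, and fonts each page needs
# --adjust-extension : save pages with .html extensions so they serve statically
# --convert-links    : rewrite links so the local copy works on its own
wget --mirror --page-requisites --adjust-extension --convert-links \
     --no-parent https://lyza.com/

# The resulting directory can then be served as-is, e.g.:
# python3 -m http.server --directory lyza.com 8000
```

The output is exactly the “Save As...” site described above: no build step, no dependencies, just files.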
...But wait: how is that any different?
No one could even tell. Because I hadn’t said anything in so long.
I am unable to adequately tell you about mud in Vermont. I want to try, but it’s useless and it frustrates me. Like the northern lights or fireflies, mud season is a you-had-to-be-there phenomenon. What one experiences, in life, as oh hot hell I’m going to high-center right this moment or imminently slide into the river just renders as a few placid, shruggable undulations in photographs.
See? It’s infuriating.
Both cars are mud-ice impacted in their wheels and brakes and pissed off about it; they need to be soothed. The road was still frozen this morning, sparred with ice crystals, driveable. I think: If I can beat the road, I can get the VW to the village car wash and back again and maybe it will stop driving like it’s on spin cycle with an unbalanced load.
As I mince-flail-mire toward the highway, I (hallelujah!) confront a Town grader, and it hups up into a snowbank to give me room to pass. I wave, hoot, thumbs-up, wild with gratitude. This buys me time — it should be able to do a pass before I’m back with the VW and maybe, just hells-yes maybe, I’ll be able to attend to the GR, too.
With the Golf, I opt for the automatic side of the carwash. This turns out to be a mistake. It lures me in, smears the car with a fleece of blinding foam and then lights up the green “OK, we’re done here” light and stops doing anything at all. I stick my head out the window to see and mince around to the manual side, pay again, wash it all off by hand. It is one degree below freezing.
Then, later, not much, I’ve made it home, swapped cars, yes!, and, then, once again in the village, feeling punchy, I hard-turn into the Dollar General parking lot[1] like an asshole. Fortunately the Tacoma I’ve hooked around rudely belongs to my friend CP and his squiggling happy puppydawg. I tell him that England last week was just great but that I have returned to a hellscape of mud. He tells me it’s never been this bad, the mud. He grew up directly above my house, like literally straight up — you should see the view up there — and his parents still live there and he tells me that no one has ever seen anything like this.
Vermont is the most Wish You Were Here state in the country, it’s legitimately like the postcards, but mud season undoes it. It’s like phenological puberty.
I’d just finished shammy-ing off the GR after a full manual wash (the car dry, me drenched) and was back on my way into-through the village when my car read out a text from an immediate family member informing me of a concerning, immediate medical situation with another immediate family member. I pulled into the village green next to the pie shop, stared out across the highway at the public tomb and the whole timbre of the morning pivoted so instantly it’s like I’ve just woken up. It’s too windy and I am unmoored by the sudden frantic love I feel for everyone in my life. I am so small and huge. To hell with this (please let it never stop).
Then I know for the next while I’ll be useless for anything but driving. I’ve already got the GR under me, unstabled, clean, on snow tires but still pliant. I suffer an irrepressible need to take the Grafton Road because it’s perfect. To Grafton village, seven miles, then back, and then I do the trip again. The first pairing to set the lay of the road conditions in my mind, the second to fly. Window just cracked to hear the sound of the car, keeping just tight enough on curves not to throw myself into the forest. There’s nothing but shoulderless road, trees, patchy snow, inclines, declines, the occasional back-set old house, pond. I encounter only one other car.
Pausing for a "Shrubbly" soda at the Grafton Village Store. All of Grafton looks this precious. It’s ridiculous.
It’s still not enough. South, to I-91 and then south again, south of Bellows Falls, stopping at Allen Bro’s in the tawny winter marshes next to the Connecticut River, to stand for a minute inside the shop with the people milling around buying coffee and sandwiches and knowing each other, being awake, alive. The wind steals all of my hair with it when I step outside again but gives me in recompense the scent of cider doughnuts.
North again on the Vermont-empty freeway, I tap up into three digits briefly to see if the GR will come unseated in the unsettling wind gusts. It doesn’t. Cautious, but I’d cleared the stretch for state troopers on my southbound pass.
Southern Vermont keeps unfurling for me willingly but I’ll need to set myself down somewhere. And so into Springfield, to BRIC at the haunting old Park Street School, here, where I can write it down, say it in public and regret it. Or not.
This tangential errand because I left every charging cable I own on an airplane a few days ago, my fastidiously-crafted kit. Good work, Gardner. DG had twenty-four kinds of USB cables and every single one of them was USB-A-to-C. My car is new enough that it only does business in -C, so that side trip was for naught. ↩︎
I had shoes once, sneakers, which I had thought were red when I bought them online, but they were pink. No, not pink. Those shoes were piiiiiiiink. So pink they seemed to make a noise when you looked at them. Not (pink) demurely spoken just between the two of us, but a honking hot marching-band donnybrook of color. A hue that came out swinging, that looked like it wanted to rub off and stain things.
It was my friend Autumn who nailed it: “Those are pank.”
This is pank.
And pank it has been, my web site, since 2015. For Lyza.com, I think it’s still the right color. For now, at least. I may change my mind within the fortnight. It’s the perfect color because it makes me uncomfortable; it fits because it doesn’t fit me well. It’s simultaneously unserious and vengeful-feeling, visceral, a little bit rageful, and more than mildly feminine (for which quality I have, at best, ambivalent feelings). It nods to the printer’s-red-and-monochrome of where I was coming from. It is neither welcoming nor off-putting, and can play, again, like printer’s red alongside the no-curves-thanks hard grid of my web site, circa 2015 and today. The color itself is a curve.
I built a new site, finally, in the summer of 2015, motivated not by a desire to make content once again, but instead a hot-breathing urge to hack, to create the personal-ultimate static-site generator and personal publishing workflow that kept my source content, whatever that might be, who cares, really, pristine, portable, human-readable, sacrosanct. I got that part, the content-separation thing, right. And the site was battened down tight and outrageously, as we said then, performant, with new-at-the-time Service Worker-based optimization and obsessive tuning.
Lyza.com during the peak-pank years.
But some other choices I made were less admirable: dependency-heavy, invent-my-own wheel JavaScript written at the height of the odd industry obsession with JS streams (e.g. GulpJS metaphors); I had no real plan for what was going to go on the site, just an eye toward how impeccable its bona fides would be to other web nerds.
It was a fun, feverish hobby of implementation for a couple of months, and I launched it and then I — neglected it entirely. What little content I did produce was...kind of boring? I listlessly popped out a few non-technical posts just to fill the blog page with something.
I’d built a ravenous pank content machine and given it nothing to devour.
Then I got distracted, the kind of distracted — writing another book, this time entirely solo; moving suddenly across the country to the woods of Vermont; facing down re-surgings of poor health and new chronic ailments — that didn’t lift for years.
I used to say anything to everyone.
For a decade and half of another I dropped any thought I had at the feet of the whole world. I didn’t pause, fret over audience. I plowed my fingers into keyboards and published at brave speed. A vignette about walking to work or photos of things upside down in reflections. Minutiae about my chronic illnesses. Complaints. Oho!, the complaints! The peeves! The snark! I did not care that no one cared what books I read or what I thought about them. I told everyone my thoughts about books with total lack of self reflection, or spent a week only posting about the planet Jupiter. I both hate and admire myself for all that content.
I wrote the last content that I would write for four years in the spring of 2011, a book review of Undaunted Courage by Stephen Ambrose.
Lyza.com on ice: spring 2011 through 2013 This is still what you will see on the landing page if you visit archive.lyza.com.
Then: silence. Lyza.com, unbreathing, for two-and-a-half years remaining exactly like this. Like the childhood bedroom, preserved, of someone who has disappeared. That’s too morose. But it’s not entirely off the mark: the sometimes-cringeworthy, often of-limited-general-interest, self-indulgent, unfiltered and completely brave content stream ceased that spring and has never resumed.[1]
This is not at all to say that the Web[2] and I parted ways. In fact, we were more best buds than ever. It was during this time that I co-wrote my first book about the web, that I started speaking and writing about the web for various publications and traveling, in relation to the web, and that Cloud Four was starting to grow some proverbial legs.
Yet with my own web site, my online persona: a stagnant fug of self-doubt and creeping shame metastasized. I became, in real life, I think, very slowly, a kinder, humbler, but far more boring person, with adult-ish regrets and insights. I think I believe now that I can aim to be only one or the other: creative, careless, self-obsessed or compassionate, thoughtful, wise, a little dull. And that latter path requires a tincture of silence and reservation.
Then again, I took that to an extreme. In early 2014, when the burden of keeping my hand-cranked WordPress plugins and PHP versions maintained became too much hassle for me, and after my WordPress installation was the target of multiple hacking events, I folded like origami and took my site down to only the most minimal bit of printer’s red and monochrome.
Yeah, that is the whole of Lyza.com, 2014. The only marks of humanity here are a nod to the traditional printer’s red and my inability to quit Caslon.
From here, just a glimmer of a hint of a nucleus of a nascent verve — that red is a blue red, a slightly hot red. It is not that many hues away from the pinkest of the pink. So hot of a pink, in fact, that we like to call it pank.
Yes, as you probably inferred, a major life event impacted this. I hope you won’t be too disappointed that I am not going to share what that was. Even writing publicly that I am not going to write publicly about something is hard for me nowadays. i.e. This. This is hard. ↩︎
I am really trying to wean myself off the antiquated stylistic habit of capitalizing web, but in certain cases it still seems to warrant it. ↩︎
I have a habit of holding my breath when I look at Lyza.com as it was in 2010. As if these pages would go to dust if I exhaled on them. I'd done the content genesis for years; now I gave it a lavish coat of finery.
The salad days, Lyza.com, early 2010
In retrospect, I think I got it right. I certainly tried hard enough:
...building and editing the photo illustrations throughout the site took me an estimated 100 hours... Nearly all of the items you’ll see in the images as you bop around the site are photos of objects in my home that I have some attachment to: books, anachronistic tools, papers, brass weights, engravings from old books, knick knacks and rocks.
I built exactly the site I wanted, for exactly that time. I went full deep, developing my own plugins for data retrieval and caching from remote APIs; photo galleries, photo posts, photos photos photos. The presentational reinvention of my site coincided with my acquisition of the new-at-the-time Canon EOS 5D digital SLR, which was a phenomenal, revolutionary camera (and I dislike the word revolutionary). This was when I finally finished my long, griefed transition away from 35mm film (Fuji Velvia 50ISO reversal film was my go-to beforehand). I was photo-drunk. It shows.
I indulged my then-current letterpress obsession by getting very persnickety about an all-Caslon font stack — system fonts only, though — achingly crafting tabular layouts and bulleting with fleurons. I was emulating Robert Bringhurst’s Elements of Typographic Style.
A detail of a post about books shows the depth of absurd typography I went to. I still get prickles from Caslon’s italics.
Would I do it differently now? Barely. I wouldn’t use that weird “magic book” image on the right side of the header[1], nor the creepy and self-worshipful analemma-halo thingy, and I’d take more care with the size and contrast of the navigation items. I wouldn’t use the BluePrint CSS framework or jQuery or CSS sprites, not because there was anything wrong with those tools at the time, but because the web is more better now and we don’t need them anymore. I wouldn’t use WordPress now, or really anything that shatters my content around a relational database, but that’s not a knock on WordPress, it’s just personal choice.
In my lookback this week, I assumed all of this visual faffery would cost, that my site then would have been a blue whale of a performance turd, but it looks like I did this:
The site, while visually intensive, scores an A on Yahoo!’s YSlow scale. To achieve this I’ve focused on GZIPping content, concatenating and compressing JavaScript, using CSS sprites and other tactics to improve performance.
Lest this entry in this series feel excessively self-congratulatory, the Beautiful Years didn’t last. Within 18 months a permafrost had set in that would mark the end of my life as a fretlessly public person. Forever.
Oddly, that was the one image of just a few that weren't my own used in the design that I paid for; apparently I bought it from iStockPhoto. ↩︎
I dragged my feet on writing this post in the series about Lyza.com’s history because I thought I’d be describing the site’s embarrassing Dark Ages, two years when my content wasn’t hosted on my own site, followed by a couple more years of something that looked even less considered than an afterthought.
I'd had it backwards. Everything until April, 2005, was the Dark Ages.
As, this week, I reconstructed these years (2005-2009) forensically, I was startled by the riot of content I was creating — typically several blog posts per week. The method of hosting, the visual blandness, the lack of hacker élan — these aren’t material to the manifest outcome, which was: content.
Everything I ever put on my site leading up to and during the “Celeste” years, no matter how lovingly-crafted and genuine and naïve and enthusiastic, those things are gone, dead. Dumped to a CD-ROM from which they have never re-emerged. In 2005, Lyza.com died[1]. But then it never died again.
In the spring of 2005, I published a brief blog post, presumably explaining why visitors were being shunted to a blogspot subdomain:
…I need a place to stash my thoughts while I develop my new software. Don’t be upset, it will all end well.
And it did end well. Or, it didn’t end, more precisely, because nothing on the web ever ends, if done right.
Yes, my site redirected to blogspot for two years. And, indeed, the two years following that, while back on Lyza.com soil, were bare-bones and structurally influenced by my adopted CMS of choice (WordPress, that juggernaut).
But that post of April 21, 2005[2] and every single one that I ever excreted afterwards is still alive, still served on Lyza.com at its original URL[3]. For better or cringe-ingly worse.
Lyza.com redirected to a Blogger blog for a couple of years. This capture is from February, 2006
In summer, 2006, I conceded that I wasn’t entirely content with the state of things (cobbler's children, shoes, that kind of thing):
...I’ve been spinning my Web wheels for nigh on two years now...
Even if it’s this stupid blogger blog for now, it has to be something. My vision for the perfect Web site for myself–someone who would likely demand that it serve as a personal killer app–is an ever-creeping morass. Even if all I get out there for now are murmurs, those are louder than my useless silence.
In the fall of 2007, content is being served once again from Lyza.com.
Fundamentally, I’d had my priorities straight: content first, fripperies later. And, my, what fripperies they’d be.
I conjecture (but cannot actually precisely recall) that the reason for the sudden shift from assiduously self-crafted bespoke web software to apathetic blogger instance was because I took a job at Intel and my former employer was probably not so keen to continue all the custom hosting and infrastructure required, which they had been, kindly, doing for years, for free, to that point. ↩︎
These archival blog posts will render with a theme from a later incarnation of the website. We'll get there. ↩︎
Which indicates that at some point I exported all of the content from the proprietary blogger CMS and imported into my own self-hosted WordPress. I do not precisely recall doing this but it sounds like something I would have done and rings the faintest of tinkly bells. ↩︎
This — late 2002 through 2005 — is an era of Lyza.com that I look back on with a condescending but genuine sense of affection for my enthusiasm, youth, dumbshittery. Basically I wrote a bunch of blog software. Again. But I still wasn’t calling it that because I was intent on reinventing a galaxy full of wheels. Tada! It’s Celeste. Look on my Works, ye Mighty, and despair!
The summer of 2003, WITH CELESTE
I recall my friend P.H. asking “What is Celeste, after all?” and — this is glorious — it’s treated as a defined term throughout the three years it blazed forth from the header. You're supposed to just know. It’s like an inside joke, but without any humorous intent on my part. I made some blogging-software-avec-yet-more-photo-database-hoohah and I wanted to call it something. The sibilant, refined-sugar sounds of Celeste appealed, and were a nod to the heavenly elements of the site’s design — it looked different depending on time of day and weather (in Portland, Ore., my hometown). See?
Hubris, maybe, bike-shedding and twiddling, definitely, but there was one thing I was doing unambiguously right: I was still making my own website, and holding all of my own content and data. Unfortunately, that was about to change.
The Internet Archive’s Wayback Machine took its first snapshot of Lyza.com on March 31, 2001. I could call the time between 1997 and this 2001 crawl the Dark Ages of Lyza.com because it’s undocumented, and, consistent with the theme, the site’s background was literally black for the first year or two.
I remember animated, glowing, purple accents — I was still a teenager — and perhaps dalliance with the brand-new <FRAME> tag. Then later, definitely, DHTML to make positioned drop-down menus and, if insane memory serves, a car that floated across the viewport (I liked cars, still do).
What this was: joy, unhindered by wisdom. Just getting something online. Self-doubt would come later, in spates. But for those years, everything was wonderful and ugly.
This is the Wayback Machine’s recollection of my website on March 31, 2001.[1]
Themes: self-absorption, photography and over-reliance on humanist-geometric typefaces, manifested as an ongoing obsession with Futura.
This is one of the few times of my life that I, a chronic un-fun-haver, can say that I was having fun. (I was also entirely miserable). I won’t speak to the reasoning behind the apparent lowercase letter-spacing being committed here; I cannot recall my own agency in the offense. Let’s go ahead and gloss with forbearing grimaces right over the palette — though I’m sure those are all web-safe colors.
This homespun nonchalance would persist until 2005: no frameworks, no tooling, no CMSes. Just my own naïf’s PHP, a MySQL-backed photo database, an overabundance of Photoshop gimmickry (transparent GIFs!, and, O!, the layer effects and glass filters!) — but again, all of my own hand. Stolid, unglamorous LAMP — this was an era when it was not formidable to be a soup-to-nuts webmaster[2]. Also, I had created blogging software by this point, but I didn’t know that’s what it was supposed to be called.[3]
If you view source of this capture, you can revel in the inlined CSS, spacer GIFs and table-based layout. I regret nothing.
May, 2002, as captured by the Wayback Machine.
A new Lyza.com “is coming soon now”…but is/was it? (And why would anyone care to wait?)
I believe the broken images visible here are an artifact of the Wayback Machine's crawling. ↩︎
The term developer didn't arise until the mid-aughts, and, boy, was I thrilled when it did. Webmaster is corny and engineer is inaccurate. ↩︎
Those of tighter scrutiny might allege that the term weblog (cheesy) and its blithering stepchild blog (eye-rolly, an unnecessary contraction) both existed by 2001. They did. But they hadn't really settled, at least not in my circles. ↩︎
If you close your eyes and picture today’s rural American small-town police chief, that’s the guy who conducted my TSA PreCheck[1] interview this week in C⸻, Vermont. Shaved head, folds at the back of the neck so that the head can fit onto the big shoulders, overloaded and outward-tilting utility belt with its medley of options both partially and entirely fatal from which he could select to suit the moment. The TSA PreCheck processing area, with its private-sector equipment, was situated across a narrow hallway from the department’s single holding cell.
Chief M⸻ asked me for my email address as we were wrapping things up. After I spelled it out L-Y-Z-A-at-L-Y-Z-A-dot-com there was a pause. This happens, more these days. I’ll admit to pride. Owning a four-letter TLD that matches my legal first name ranks alongside being left-handed in terms of identity aspects whose loss would cause me profound grief and self-confusion. Huh, he said after a moment, and asked me how this had come to pass. And I unshipped my usual vague answer: I’ve had the domain a long time, I dunno, the late nineties? I’ve been doing this web thing a long time.
With Lyza.com about to evolve, technically and philosophically, yet again, it’s a natural navel-gazy time for me to be curious about its genesis and history. To put a timeline to this little domain that has shadowed more than a quarter century of a human’s life and echoed the shape and moods of the web.
I registered Lyza.com on the 18th of May, 1997. I don’t know that because I wrote it down, or because it has enough weight with me that I remembered it of its own accord. I know because an ICANN lookup today told me so. I was 19 years old then. I know that because I am more firm on my birthdate; that is a date I know.
Not that it was my first website. That would have likely been associated with my university computer account, along with my first email address, which was — and I do remember this without struggle — [email protected]. Then a series of accounts at local ISPs like Hevanet. These were the tilde-FTP days, basically serving straight out of your shell account’s home directory. All this before lyza.com.
If I had to put a date on it, that is, My First HTML Document, I’d wager 1995, possibly late 1994. My mother was the personal technology reporter at The Oregonian at the time. She showed me news groups, the NCSA Mosaic web browser.
The 1990s happened before universal self-promotion and the ceaseless, obsessive contributions of tiny pieces of our lives to entities that we neither fathom nor like much, and the concomitant digesting and compaction and mashing and indexing of all of those bits of humanity online. I was 19 when my domain arose. I wasn’t anywhere near parts of my life to which terms like stewardship or archival could be applied. Thus the fact that any traces remain cannot be credited to me. I wasn’t careful. And those traces are faint.
But let’s start here: Lyza.com will turn 27 on May 18, 2024. Now I know.
TSA PreCheck® is a program through the Transportation Security Administration (TSA) that "expedites traveler screening through participating TSA security checkpoints", e.g. I won't have to take my shoes off or my laptop or liquids out of my bag when going through security at participating U.S. airports. The TSA "partners" with private-sector "enrollment providers" to process applicants, in my case, Idemia, "leader in biometrics and cryptography." ↩︎
I made oak angels today at the Robert Frost Farm in Derry, New Hampshire, on a day trip with my friend DG. I just dropped down in the duff and scribbled my limbs for a while. The oaks are the last to relent before stick season, their leaves so gone hard copper it is difficult to credit that they don’t clank or shatter when they hit the ground.
The sun was out, all day, and this is so unprecedented of late here that the digital signs on the New Hampshire highways warned: SUN GLARE POSSIBLE. Which was true, and I flipped visors restlessly on both the outbound and return-bound drive to staunch the migraine-encouraging tattoo of tree-tree-tree-tree-tree-tree shadows over the road from my left.
I can count on no fingers how many times I’ve been to Manchester, New Hampshire, before, and I got to fix that today, with D, my friend. It, Manchester, has alleys and wide thoroughfares like it believes it’s a metropolis. In this regard, it reminds me of the misplaced urban exuberance of Pittsfield, Mass. Signs in a park said NO DOG FOULING. Cold today, everything steaming, again like bigger cities.
We looked at Charles Sheeler’s perfect take on Manchester’s mill and canal district, ambiently eerie and emotionally distant, in the Currier Museum of Art. I got very emphatic and almost shouty when I spotted it from across the gallery; “The Charleses” (Sheeler and Demuth) are a pair of my favorite American painters, who, true story, almost inspired me to get a Master’s degree in art history, with a focus on 20th century painting between the Wars.
This day was a gift for driving, and my car has the dopamine feedback package that encourages one to give it the foot. Get it in the hammer lane, stomp it, and after the briefest holding-of-breath, agony-and-ecstasy pause, it grunts and f-ing goes and it is beyond my willpower not to do that again and again. Unfortunately, the most exhilarating BLAAAWRT happens around 83 MPH, not that I’d know, of course. The exhaust system is tuned for the sensibilities of a 19-year-old. It’s genius.
But no, I was out for stars;
I would not come in.
I meant not even if asked;
And I hadn’t been.
-- Robert Frost
I’m posting these photos mostly because I got in trouble for taking them in a Market Basket supermarket in Keene, New Hampshire.
Just after photographing the meat sign I was confronted by a man in a bloody butcher’s apron (such striding along the length of the meat cases!) who asked if I was an “agent” or a “vendor”, which I denied but had the unsettling guilty feeling the non-guilty get in wrong-footed surreal situations. “We have special rules about taking pictures.” I guess I nodded. I imagine the special rules are: “don’t.”
The dressing down continued for a while and I don’t think I disabused the guy of his hypothesis that I was stealing corporate secrets. “I just like your sign,” I said, “it’s very...” (in the moment I couldn’t think of the word that means the opposite of disingenuous).
And, really, that Market Basket is a photographer’s dream, it’s a bonanza; moments before, I had buried myself in a display of Jeff Koons-like, exquisitely reflective silver mylar balloons in every digit, 0-9, which they, Market Basket, in their presumably well-researched retail wisdom, stock directly across from the meat. I was tempted to buy out the whole set, because we could have done such profound things together, me and The Digit Balloons! (forgetting for a moment the fiscal irresponsibility of this idea vis-a-vis my current unemployment).
Eventually, meat-man conceded that I wasn’t technically “in trouble,” and allowed me to complete my shopping. A vivid example of that famous New England hospitality and warmth. I shop there because they reluctantly, very begrudgingly, barely let me.
Then again, I’m the one who (sometimes) drives to the Granite/Live free or die! state, (considering the state’s position on motorcycle helmets, I tend to think of it as Live free and die!) to do my grocery shopping. I suppose that part’s on me.
At least I put all three cylinders in the GR to work getting myself back to Vermont and home, enjoyed a cloudburst on the brief I-91 stretch, saw two bald eagles, one rainbow.
As part and parcel of writing my book (JavaScript on Things, Manning) and scratching the curiosity-itches of my hardware-hacking hobbies, I’ve had my hands on scads of different hardware platforms and components over the past year and more. This series of articles contains a collection of opinions and recommendations based on my experiences as they relate to two focus areas: electronics for beginners and JavaScript-controlled hardware.
This post’s electronics experience rating: INTERMEDIATE-PLUS
In a previous post, I identified two standout development boards for beginners. In this one, I highlight a couple of boards that have surprised me by becoming some of my very favorites.
My introduction to Adafruit’s family of Feather boards wasn’t auspicious: while trying to cobble together a little network of LoRa radios (maybe I’ll talk about this more in-depth sometime as a case-study of Never, Ever, Ever Give Up; maybe it’s just too embarrassing), I feverishly ordered several Feathers from Adafruit, amazed that you could buy a tidily-packaged board with microcontroller and LoRa module for ONLY TWENTY DOLLARS.
Image: Adafruit Feather dev boards on their website.
Actually, you can’t. Tripped up by similar product names, I ended up ordering these boards with only the radio module on them, thinking they were these $35 boards that have a freaking MCU on them, which is highly useful if you actually want to use the radio somehow.
During a protracted evening unboxing it dawned on me that the newly-arrived boards looked rather…minimal. I traveled several miles of self-hate that evening. By this point, the $35 ones with the MCU were sold the hell out, so I scrambled to find another Feather model that I could use to control the first set (it’s Feathers all the way down, ol’ chap).
This time: I ended up with some 32u4 Basic Proto boards that I wanted to stack with the brainless LoRa Feathers.
Won over I was not. The (limited) pin layout seemed a tad cruel. What’s worse, only four pins have interrupt support and those pins are also the only pins that support TX, RX, SDA and SCL, a rather project-ending bummer when you’re building something that involves serial logging of I²C sensor readings. Also awesome when I’d already soldered fly wires on the radio-only modules in a 32u4-specific fashion. Three cheers for desoldering wick and chagrin.
A few table-flips later I ultimately netted a small stack of the Feather M0 Proto boards. Now a few hundred bucks down the Feather rabbit hole, I was steeled for rueful-ironic laughter. And, yet. What is this? These boards fundamentally do not suck. In fact, they’re pretty great!
Image: Adafruit Feather M0 Basic Proto. Source: Adafruit website.
There are some features common to many (if not all) Feathers that I didn’t get the joy of experiencing in my initial raging failfires but came to enjoy once I had my M0s in hand.
I like the boards’ power options. You can plug ‘em into USB power, natch, but you can also attach a LiPo battery to the already-attached-and-waiting-for-yah JST connector and it just works. It’ll charge when on USB power, automatically prefer USB power when plugged in, then automatically switch over to LiPo power when disconnected. This is all kinds of convenient and awesome.
The consistency and commitment to the Feather form factor is handy if you want to stack or otherwise mix boards. There are a number of header arrangements you can choose between. I’d love to see a project that just keeps stacking Feathers on Feathers on Feathers…
Also, minor detail but one close to my heart: the pinout diagrams for the Feathers are clear and well-designed. Just lovely.
Image: The M0 pinout diagram is clear and easy to read. Go see it full size. Credit: Adafruit.
I’m going to keep calling the Feather’s MCU the M0, because its actual naming flourishes surpass racehorses/members of the Hapsburg royal dynasty (feast on it: “Atmel® | SMART SAM D ARM® Cortex®-M0+ based microcontroller (MCU) “).
The M0 is a great little friend. Tons of pins, interrupts everywhere, PWM like it’s going out of style, and a not-too-hard-to-mess-with SERCOM for adding/munging serial interfaces (want all six serial modules to be SPI? You can probably do that). Quick compared to the ATmega32u4, with more Flash, more megahertzles, yet still low-power.
Coupled with the little Feather form factor it makes a pleasing platform. It’s not bad to work with for Arduino-compatible stuff, once the finicky little bits of IDE/board support are sorted.
The only un-work-aroundable thing I ran into was a bug in the M0-specific Arduino Wire library (probably?) that made it impossible to use these boards as I²C slaves (as master, the more typical setup anyway, they’re fine). One day maybe I’ll go check on the status of that bug.
Anyway, I like this board. Which is funny because I had to go through much pain to find it.
]]>From time to time, we celebrate certain conference videos—usually the kind that show a speaker and some slides in a humorous or educational harmony. But I've also started noticing how high quality some of the intra-conference production can be, like the short video intros that are played between sessions. I'd like to share two from the past year that I thought were too good to be relegated to the cobwebby annals of forgotten conference collateral.
In Marc Costa's intro video for Smashing Conf NYC 2017, different materials are laser-cut, twisted, bounced and folded into representations of speaker names, making strong use of the third dimension. It's an artful melding of the technical and the tangible. While each individual composition is handsome, it's the entire collection together that feels sublime.
A still from Marc Costa's Smashing Conf 2017 intro video. Watch it on Vimeo
CreativeBloq's intro video for it's 2016 San Francisco Generate conference is all digital, but the animation is inventive, dimensional and kinetic. And I happen to be a fan of the energy of the music.
It is awfully fun watching my own name unfurl! Watch it on YouTube
As part and parcel of writing my book (JavaScript on Things, Manning) and scratching the curiosity-itches of my hardware-hacking hobbies, I’ve had my hands on scads of different hardware platforms and components over the past year and more. This series of articles contains a collection of opinions and recommendations based on my experiences as they relate to two focus areas: electronics for beginners and JavaScript-controlled hardware.
This post’s electronics experience rating: ABSOLUTE BEGINNER
A question I field often is: “What kind of board should I buy to learn how to do this stuff? Where do I start, hardware-wise?” And there are two boards I tend to recommend, in the end—which is why they figure so prominently in my book (Don’t worry, I’ve provided some other educational links for each in case my shameless plug is…shameless).
Let’s not go off half-cocked with a major undefined or vague term. What’s a development board, anyway? In my book I use the following definition:
Development boards, also called prototyping boards or just boards, are physical development platforms that combine a microcontroller or other processing component with useful supporting features. They are the bread and butter of the hardware-hacking lifestyle. Boards range in cost from just a few bucks to over $100 for high-end [Single-Board Computers, a.k.a.] SBCs.
Boards are centered around their brain, a combination of processor, memory and I/O. 8- or 16-bit microcontrollers are at the center of straightforward, entry-level prototyping boards like (most) Arduinos. Boards with more sophisticated, 32-bit microcontrollers may be able to run embedded JavaScript.
When setting off on a learn-electronics journey, choosing a board is a solid first step. It’ll determine the trajectory of your explorations.
Photo: A selection of development boards. Clockwise from top left: Texas Instruments Launch Pad, Arduino Uno R3, Adafruit Trinket 5V, Particle Photon
Features that make a development board ideal for beginners include clarity, constraints and ubiquity. Put another way, good dev boards for n00bs have simple, obvious features and are used by a whole crapload of folks.
A good beginner dev board should support a reasonable set of I/O features (note: this is usually dictated by what features the board’s microcontroller can support). Different pins on the board support different features, which should include—at the least—digital I/O, analog input, PWM (pulse-width modulation), ADC (analog-to-digital conversion) and serial support (including I2C and SPI). Don’t worry a tick if you haven’t heard of any or all of those things yet.
Pins should be sensibly laid out and named (or, often, numbered). Many boards have pin numbering and capability information helpfully silkscreened onto the board itself. A board with missing key features, nonsensical pin layout or a confusing pinout diagram can make you wish you hadn’t bothered.
Photo: Detail of the Arduino Uno (R3) board showing silkscreened pin numbers.
The more things you can do with a dev board, the more distracted you can get. While a juiced-up microcontroller, or more flash memory for your programs, or additional peripherals and goodies sound tempting, they’re also potential pits to wallow in and get overwhelmed by.
Limits to the oomph of your first dev board can paradoxically be freeing, allowing you to zero in on the fundamentals of electronics hackery without suffering from the angst of overchoice.
For example, despite their popularity, I’d recommend against the Raspberry Pi family for introductory electronics explorations. So much is possible with the Pi platform that it can be distracting and overwhelming if you want to focus on electronics fundamentals for a bit.
Also, Pis fall short in feature complement: while they certainly have some on-board I/O capabilities, their convoluted pin numbering, limited PWM (pulse-width modulation) support and complete lack of ADC (analog-to-digital conversion) makes introductory I/O hackery potentially confusing and frustrating.
An aside, though: If you want to tinker around with projects that make use of the Pi’s general-computing powers—for example, cobbling together a tablet-like PC for your kitchen—by all means, do! They are fantastic tiny computers, just not the most ideal beginner electronics platforms. The newer Pi 3 is especially fantastic, by the way.
When you’re choosing your first dev board, you’re focused on learning how to do things with electronics. Getting sidetracked by a platform’s idiosyncrasies (or, even, bugs) is not only annoying, it can be derailing. At this point in your journey, it shouldn’t be your job to figure out whether a component or project is misbehaving because of some arcane wonkiness in the hardware (or supporting firmware) itself. Working on a well-tested platform is helpful here: when an LED fails to blink, it’s more than likely pilot error; you can factor out most (but never all, truth be told) random flakiness when trying to figure out what’s up.
And when you do get stuck or confused, it’s awfully nice to have a big corpus of support to turn to.
For both of these—stability and support—an underlying core factor is the platform’s popularity, its relative ubiquity. For beginners, choosing a platform that is widely used and has been around for a while can be a sanity-saving tactic.
At the end of the day, there are certain dev boards that tick most of the boxes time and time again, for each of two approaches (and you choose your own adventure here!):
1. Focus on electronics fundamentals with a straightforward, well-established board.
2. Build web-connected devices, with the npm ecosystem of Node.js available to you. This will require a more sophisticated microcontroller/processor than those needed for the first path.

Ohmigod, how boring! you might be lamenting. I’ve just held up the non-cutting-edgiest, pedestrian, and basically first open-source board as a shining example of newbie awesome.
Illustration: A bad artist's impression of the Arduino Uno (R3) board.
Yes, it’s true, the Uno has been around for a long time, and, also true, is seriously not flashy, but it really is a platform of the people. Its relative stodginess is a blessing when compared to the immaturity and fragility of many boards. Sure, working with one won’t give you the smug frisson of novelty, but it will, in most cases, actually work. The Uno does what it says it does. If it didn’t, there are a lot of people who’d be irked.
Statistics are spotty at best, but even back in 2013, 700,000 official Arduino boards had been “registered”, with an estimated 700,000 additional “derivative or clone board[s]”, that is, about a one-to-one ratio (source).
Let’s talk about a couple of things here. First: Arduino is a family of boards, and the 700k number references Arduino boards of all stripes, not specifically the Uno (though the Uno is the most popular, and, no, I don’t have a good source for that but it certainly seems to be the damned case). At the same time, it’s been several years so surely there are many more. All right, that’s out of the way.
The “derivative or clone board” notion arises because while Arduino is open-source, “official” ones carry the Arduino logo and can only be made by certain blessed manufacturers. Non-official ones are fine, too, but there is a bit of a thing with knock-offs that have the logo but shouldn’t, so-called “counterfeit” Arduinos, which are cheaper to buy but potentially shoddy.
Here’s how the Arduino Uno R3 stacks up:
Expect to pay about US$25 for a certified Arduino Uno R3, considerably cheaper if you’re willing to take a risk on reliability, which, at this stage in the game, I wouldn’t?
The Uno: unglamorous, reliable, and sui generis for educational hardware hacking. There is really no substitute.
OK, so there is this fantastic book—brilliant visual design, clear instructions, just in all ways excellent. It’s called Arduino Projects Book. But there’s a pretty major downside: you can only get it if you purchase the (official) Arduino Starter Kit. That’s a bit of a bummer, but I will say: that starter kit rocks. It’s worth getting if you can spare the extra funds (it currently retails at $85). Of course, there are tons of other getting-started books on the market if you Google around a bit.
LadyAda’s tutorials are a free resource, and there are a handful of books for beginners called out on the Arduino website.
And now for something completely different, and lightly violating what I said about constraints and ubiquity: the Tessel 2 board. The Tessel runs an optimized-for-embedded Linux distribution, OpenWrt, and has enough oomph to run Node.js on-board. So, if you’ve selected the “Web-Connected Devices” adventure, you can npm to your heart’s content and do some pretty complex, nifty software things in short order.
Like the Arduino, Tessel is an open-source platform. Also like the Arduino Uno, the Tessel’s pins are sanely organized and labeled and the board supports all the major features you’d expect. It’s a more sophisticated device than the Uno, with a separate co-processor for handling I/O. There are some frills. You’ll likely put the on-board WiFi to use straightaway. There are two USB peripheral ports, which are fun to play with, but I’d recommend keeping things simple and ignoring those initially.
For controlling the Tessel 2, I recommend Johnny-Five, an open-source, widely used, cross-platform-compatible JavaScript framework.
Here’s how the Tessel 2 stacks up:
A lot of bells and whistles don’t necessarily make this a complicated platform. The workflow is smooth and feels natural for Node.js/JavaScript programmers. Johnny-Five has intuitive, built-in support for tons of components.
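As a taste of that workflow, here’s a minimal Johnny-Five sketch for the Tessel 2 using the tessel-io I/O plugin. This is my own illustrative example, not from the kit guide; it assumes you’ve installed the johnny-five and tessel-io packages from npm and that a sensor is wired to the pin named below.

```javascript
// Log readings from an analog sensor wired to the Tessel 2's port B, pin 7
const five = require("johnny-five");
const Tessel = require("tessel-io");

const board = new five.Board({ io: new Tessel() });

board.on("ready", () => {
  // "b7" is one of the Tessel 2's analog-capable pins
  const sensor = new five.Sensor("b7");
  sensor.on("change", () => console.log(sensor.value));
});
```

Deploying is a one-liner with the t2 CLI (t2 run sensor.js), and largely the same Johnny-Five code would run against other supported boards.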
SparkFun sells a complete Johnny-Five Inventor’s Kit, which includes a Tessel and a bunch of components (disclaimer: I was involved in writing the experiment guide). Even if you don’t buy the kit, that guide is pretty exhaustive (tons of great effort from several committed people!).
After I gave my talk, A Pragmatist's Guide to Service Workers, at Smashing Conf NYC, several folks gave me feedback that they liked the metaphor I used in a tangent about JavaScript Promises, which you need to understand before you can make sense of the Service Worker API.
Here is just that section of the talk. The allusion to the Fetch API is because of its importance to Service Worker. Enjoy? Enjoy!
The following is an excerpt from chapter 3 of JavaScript on Things: Hardware for Web Developers by me, Lyza Danger Gardner (I also take the blame for the illustrations). I hope you enjoy it! The book is in early release (MEAP) from Manning Publications. It's intended as a soup-to-nuts get-started guide to electronics hacking for JS-familiar web developers.
Designing and building circuits may be completely new to you, and may seem intimidating. The good news is that there are just a handful of core concepts to wrap your head around. Once you understand the interplay of voltage, current and resistance—as formalized in Ohm's Law—you're well on your way to being able to understand basic circuits.
There are a couple of metaphors traditionally used to illustrate voltage, current and resistance. The most common analogy is a hydraulic (water) system involving tanks and pipes. Effective, but not always memorable. Let's try a different adventure.
High in the mountains, deep in the forest of some place that does not exist, a tribe of gnomes found themselves inexplicably in possession of an infinite supply of jellyfish. And the gnomes being ornery and mischievous, they struck out to find a humorous use for the otherwise-inert creatures. They found great fun in dropping jellyfish over cliffs, watching them splash into the lake below or bounce off the roofs of local villages.
The nearby townspeople were initially inconvenienced but soon recognized that the plummeting invertebrates carried energy and could be a free source of power for their cookie factories—but only if the onslaught could be harnessed safely. So they observed, and, over time, came to understand and manipulate the core factors of electrical circuits: voltage, current and resistance.
Townspeople noticed quickly, for example, that the higher and steeper the cliff, the more energy tossed jellyfish have when they reach the lake on the valley floor. Lesser drop-offs don't provide as much potential energy for hijinks when the jellyfish splash down.
Higher cliffs provide more "voltage", that is, electrical potential. Voltage is like electrical "pressure", pushing the charges (jellyfish) from high potential energy toward a location of lower potential.
Voltage is a measurement of the difference of potential energy between two points. It's something like pressure or tension or gravitational force, as electricity is always itching to move from higher voltage to lower voltage. Voltage, measured in volts, is potential energy, however—voltage alone without moving charged electrons (jellyfish) can't wreak any havoc.
For something interesting to happen, jellyfish need to get actively chucked over the edge of the cliff, a task which the gnomes are more than happy to perform.
The townspeople learned to measure jellyfish current by staking out a spot on the cliff and counting the number of jellyfish that passed by over a precise period of time. Current, the flow of electric charge, is measured in amperes, often abbreviated as amps.
Current, the flow of electricity, can be measured by counting how many charges (jellyfish) pass a specific spot on a cliff during a defined period of time.
The townspeople needed to find a way to manage the current of jellyfish so that it wouldn't overwhelm the delicate cookie presses and ovens. This is the lynchpin of jellyfish circuit control: resistance. Resistance is how much a material is able to resist electrical flow. It is measured in Ohms.
They engineered jellyfish-channeling systems into the cliff faces, restricting their flow to a more reasonable level. For circuits near the higher cliffs (more voltage), these systems had to be more robust because of the immense jellyfish-falling pressure from above.
Townspeople add resistance to the circuit by channeling falling jellyfish through a series of tubes. Increasing resistance lowers the current.
A summary of the townspeople's discoveries is shown here:
Voltage: electrical potential, analogous to the height of the cliff; measured in volts.
Current: the flow of charge (jellyfish) past a given point over time; measured in amperes.
Resistance: how much the channeling systems impede the flow; measured in ohms.
In the end, the townspeople perfected the circuit and the jellyfish helped to make some of the best cookies around.
There is a power source—troops of gnomes—tossing jellyfish over a cliff. The higher the cliff, the more voltage (potential energy) is supplied to the circuit. The current (flow) of jellyfish heads toward the factory machinery.
To reduce the jellyfish current to manageable levels, channeling systems and pipes add resistance.
Once the jellyfish have given power to the cookie-making machinery and reached the floor of the factory, they reach the point of lowest potential in the circuit. Jet-pack-wearing gnomes act like a pump of sorts, hoisting the weary jellyfish back up the cliff where they can be thrown over again. And again and again...
Voltage, current and resistance are vital concepts of basic circuitry. The next step is to understand how these factors relate to each other, and how they apply to real-world circuits.
Voltage, current and resistance are related to each other in consistent ways. Each of the three factors is like a lever: tweak one and you will affect the others. These interplays became so central to the town's populace that the factories started producing cookies that illustrated the relationships.
The townspeople's new signature cookie showed the relationship between voltage (V), current (I) and resistance (R).
The bearer of the cookie can bite off the factor she wishes to determine—then see how it can be derived from the other two factors. For example, if she wanted to determine resistance (R), she could bite that off and see that R = voltage (V) divided by current (I).
Georg Ohm figured out these key relationships between voltage, resistance and current back in the 1820s, well before the clever cookie-townspeople, which is why Ohm's Law bears his name. If you prefer your math in non-cookie form, the relevant equations are:
V = I x R (Voltage equals current times resistance)
I = V / R (Current equals voltage divided by resistance)
R = V / I (Resistance equals voltage divided by current)
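The three equations translate directly into code. Here's a tiny JavaScript sketch (my own throwaway helpers, not from the book):

```javascript
// Ohm's Law, in each of its three bite-off-able forms
// (V in volts, I in amps, R in ohms)
const voltage = (i, r) => i * r;    // V = I × R
const current = (v, r) => v / r;    // I = V / R
const resistance = (v, i) => v / i; // R = V / I

// Worked example: a 9 V battery across a 450 Ω resistor
console.log(current(9, 450)); // 0.02 A, i.e. 20 mA
```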
OK, you might be thinking, but how do I apply this in the real world? ... Well, I hope you might give JavaScript on Things: Hardware for Web Developers a read and find out!
Up until a few months ago, I lived in a space-constrained duplex in a happening urban neighborhood in happening Portland, Oregon. Now I live in the middle of a mountainous forest in Vermont, on a dirt road, next to a river. I changed jobs. I got rid of about 75% of what I owned. On the face of it, everything has changed. Or has it?
Here are some thoughts on all of the changes, as well as some thoughts on how much hasn't changed at all. In the form of a FAQ, because: why not?
In a town in southern Vermont. When I say town, I mean it in the New England sense: a square-ish area of land roughly 6 to 10 miles on a side. Not all towns have villages or any form of conurbation; ours does but it is about eight miles distant.
We live on a dirt road that is well-maintained but is, still, dirt. Our nearest neighbors are across the river, about 800 feet as the crow flies but a quarter-mile walk to get there. They're the only neighbors within screaming distance.
There's a general store about six-and-a-half minutes away by car. The two local villages are each about a 12-minute drive. One has a grocery store I'd roughly equate to Safeway on the west coast: it's sufficient but not inspiring.
It's remarkable and rewarding how much of our food is obtained directly from the humans who made it. We belong to a little CSA up the road—it's personal enough that they'll notice and comment on our absence if they don't see us for a week or two. There's a Jersey dairy—raw milk and eggs—down the road. Pork and chicken often come from another farm run by a couple in the next town over. And one of our neighbors raises grass-fed lamb (and may—CROSS FINGERS—scale into acorn-fed pork soon).
Tons of local relief. Our house is at the bottom of a valley. We are surrounded by forest, save for a small field on the east end of our property which is used for hay for our neighbor's sheep. I usually say that, aside from that field, you can tell when you're on our property because you won't be able to stand up properly—it's that steep, most of it.
There's a river. Technically it's the North Branch of the Williams River, but no one ever calls it that. It's just the river. It's a small river; you can walk across it easily, a large creek, really. The water is clear and the rocks various and interesting.
No, not entirely. So, no. We do have a grid-tied pole-mounted 3kW solar array, but it doesn't power the whole house. We're also connected to civilization through our phone/Internet lines (more on that shortly). Aside from that, though, yes: we have a well, septic. Our heating fuel is provided by an in-ground propane tank.
Is that a question or a statement? Here's the best part of all the parts, when taken as part of the beautiful whole: our Internet is fantastic. The local telecom got a grant from the federal government and used it to invest in a fiber grid. We pay a reasonable amount and we get reliable 1G symmetric connectivity (real-world performance between 600-800Mbps). Yee haw!
This has been the most surprising thing about this change. I expected loneliness; I expected to have to work intensely to have contact with other humans. And yet. We know all of our neighbors. Our front yard and porch are visible from the road and sometimes people just drop by if they see us. Our friends and family visit us.
Sometimes we won't leave our property for four or five days at a stretch and it isn't bothersome at all.
Yep! I'm lucky to have my position as an Open Web Engineer at Bocoup. Their systems and people are well-oriented for distributed teams. We're fortunate to have a real, dedicated office portion of our house: a separate staircase, large shared office room and a half-bathroom. So "going to work" feels like a thing, still.
Nope.
Really. But we'll see what happens once winter comes!
2016 is my own year of combining JavaScript with hardware in the real world, including a large project that I'm almost ready to talk about, but not quite. When I talk to web developers about hacking on hardware, often the first question I hear is: but how does it work? How do JavaScript and hardware work together?
There are several technical ways JavaScript can be used to control physical objects. Most of the objects I'm talking about loosely here can be categorized as embedded systems. Embedded system is a term that sounds a bit more complex than it need be—in (blog-post-informal) essence they're embedded because they tend to be hidden away inside of something, like a thermostat or a microwave or a homebrew 3D-printed project enclosure; systems because they combine a tiny computer with various connections and power to create, well, a system.
The "tiny computer" involved is typically a microcontroller, which combines processor, memory and I/O capabilities in a single package (that is, a chip). Microcontrollers tend to be simple creatures, 8- or 16-bit processors with limited memory (on the order of tens of kilobytes, often) that require very little power to do their thing. They're cheap, reliable and ubiquitous.
Microcontrollers are small, with absolutely itty-bitty connection pins. For novices and hackers and prototypers, development boards make it easier to work with microcontrollers by providing human-sized ways to connect to the I/O pins of the microcontroller. The boards also provide a steady way to power the microcontroller as well as several supporting features like timing chips, connections for different communication protocols, easier methods to get programs onto the microcontroller, etc.
Developer boards like the omnipresent Arduino Uno give easy access to I/O connections and take some headache out of working with microcontrollers.
Microcontrollers are getting cheaper and faster. Newer, 32-bit MCUs (that's short for microcontroller units), especially some of the ARM Cortex-based ones, are capable of running actual native JavaScript, or something damn close to it. It has become feasible to run embedded JavaScript. But we'll put that aside for another time.
Aside from embedded JavaScript, another method for controlling hardware with JavaScript is a host-client setup. Recall that microcontrollers can be quite constrained—8- or 16-bit processors, very limited space for programs. They're not up to running an operating system or Node.js or executing JavaScript on their own. Instead, the host-client configuration turns the hardware (a microcontroller-based board) into a client which does the bidding of JavaScript executed on a host (e.g. your laptop).
A combination that works consistently for me is uploading (flashing) the firmata protocol to an Arduino-compatible board—this might sound mysterious but is as easy as uploading a pre-packaged script to the board from the free and cross-platform Arduino IDE. Then I write scripts using Johnny-five and execute them with node.
Johnny-five provides a high-level API (with classes like Led, Compass, Piezo, etc.) that can feel more comfortable to higher-level programmers than microcontroller code written in C or whatnot. Getting started is pretty easy!
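To make the host-client flow concrete, here's a minimal Johnny-Five sketch of the classic blinking LED. It assumes an Arduino-compatible board flashed with StandardFirmata, plugged in over USB, and johnny-five installed from npm, so it won't do anything without hardware attached:

```javascript
// This script runs in node on the host (your laptop); the board is the
// client, executing firmata commands that johnny-five sends over serial.
var five = require("johnny-five");
var board = new five.Board(); // auto-detects the board's serial port

board.on("ready", function () {
  var led = new five.Led(13); // pin 13 is the Uno's built-in LED
  led.blink(500);             // toggle every 500 milliseconds
});
```

Run it with plain old node and the process stays alive, acting as the host side of the conversation.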
I recently gave a presentation about this at the March meeting of Portland's JavaScript on Things Meetup.
To learn more about how to prepare a board as a client and some Johnny-five basics, you can read my (warning: pretty informal!) slides. Or jump straight to looking at some examples of scripts and wiring schematics, like this "pointing-north alarm":
I was also just on Hanselminutes talking about the host-client method of controlling hardware with JavaScript.
It has been a banner year so far, 2016. I've been included in some frankly amazing things: events, publications, conversations. There are things coming up this year that I'm just bursting to tell you all about (I will as soon as I can!). At times I stop to reflect that I just cannot believe how fortunate I am to have so many opportunities.
But as happens every so often, I'm not so sure I deserve this. I want to believe I'm invited to speak because I'm becoming reasonably good at it, that magazines want to hear what I have to say because what I have to say is cogent. Sometimes I believe that to be the case.
But at other times, I worry. Like today. I got invited to speak at an event far away in an interesting place, and, man, that would be awesome except that by some coincidence I am already slated to be at an event far away in an interesting place at the same time, so I had to decline. Instead, I passed along the name of a talented male colleague as a possible alternative.
The response:
Thanks...though I'm specifically looking to get more ladies on my lineup.
OK. I get it. Organizers want to have diverse lineups. But spelling it out to me like this makes me feel wretched. Is my gender my only qualification? Would I have made the cut if I weren't a woman? Am I really qualified to be doing anything that I'm doing? Do I bring anything useful to the community?
I know that the sentiment behind things like this is a desire to bring more voices to the fore, to give chances to under-represented folks. But it hobbles me with self-doubt.
Oh, the year's almost over! I went to some conferences. I spoke at a few conferences. Cloud Four held our first-ever event, Responsive Field Day, in September. What follows are my own editorial opinions about the good, the bad and the "whatever, enh" cobbled together from my three types of involvement (attendee, speaker, organizer) in conferences about the web.
I've seen a rash of session times ranging from 15-30 minutes instead of 45-60. There are some self-evident pitfalls here—schedule management, content whiplash, inability to dive deep—but for many conferences, this works fantastically for me from all three (attendee, speaker, organizer) angles.
As a short-attention-span attendee, my interest is held well. Speaker energy tends to be more constant, and more focused. This leads to more interesting, denser presentations and allows for more topics to be included in a single event.
As a speaker, I adore shorter sessions. Please. More. I'm able to maintain intensity throughout without getting tired, and it forces me to focus my thesis. It also allows me to be more precise in timing because I'm working on a tighter scale. The majority of my most-successful presentations have been delivered in shorter time slots.
And for organizers, shorter sessions make it possible to include more content and a greater diversity of voices in a single event.
The role of Twitter is changing. For many it remains indispensable at conferences. But I've also attended a few conferences this year from which the Twitter traffic was virtually nil.
My attitude puts me somewhere in the middle. As a conference attendee, monitoring the feed from the hashtag for the event can be useful. As a non-attendee (i.e. my other contacts tweeting about conferences I'm not at), conference-related traffic is effectively useless. Twitter, as always, seems to be a matter of opinion.
2015 is the year of everyone falling in love with Slack. It happened to us at Cloud Four, too (we've been using Slack for quite a while now, for kind of everything). Slack's spike in popularity has led to its use as the backchannel of choice at several conferences I attended this year. On the one hand, it's an efficient and engaging way to communicate. On the other hand, it can serve as an inducement to futz around on devices during sessions and a disconnect or distraction. The jury's out on this one for me.
As I recall my year of conferences, what floats into my head is happiness. Whenever I speak at an event, I feel honored to have been invited, and hope that I am living up to the high expectations of attendees and organizers. I've attended events with crackerjack content and energy. And now I've been fortunate enough to come full circle and see what it's like from the organizers' side.
Curious about other conference-goers' experiences this year!
The morning keynotes at Node Interactive just happened, and I'm reeling a bit and talking to interesting people, but I wanted to write something down before I forget.
Node.js versions are cray-zee for me right now.
For some reason, it was calming when James Snell mentioned that the late-2014 Node.js/io.js split freaked everyone out, and that my desire to stick my fingers in my ears and pretend it didn't happen echoed how most of the community felt, too. And so we are all glad to see the convergence, and now the best of both communities combined:
And a whole lot less weird tension about the Node.js vs. io.js fiasco.
Cool. By now, most of us in the node community have seen the release schedule for Node.js. The launch into that was fast and heavy: 4.0.0 in September, 5.0.0 in October (at the time of writing we're at 5.1.1 for stable). This has led to a sense of rushing to be writing new code on the latest, but hang on.
4.x and 5.x are both stable, but they're not the same: 4.x will go into active LTS (long-term support) in six months, and will stay in LTS for 18 months after that. 5.x is stable, but will never go into LTS. That is, stable-with-LTS releases are yearly—and they're the releases you should lean toward for long-term, production work.
Yes, we are moving quickly as a community. Also in Snell's keynote was this statistic: about 38% of folks are at 4.0.0 or better. That's fast.
At the same time, there is a lot of legacy code out there. Tom Croucher, in his keynote on Node.js at Uber, mentioned a currently-open issue in Uber's codebase to upgrade something that is still on 0.8. If most of your stuff is on 0.10 or 0.12, you are not alone. Just note: active maintenance for 0.12 ends about a year from now; 0.10 earlier (Octoberish). You have some time, but not forever.
There are assorted ways of managing node versions from within a project, but a universal pattern hasn't quite coalesced.
Dropping a .node-version in a project root is a tactic I've seen on occasion, but its inclusion in version-controlled code can be bothersome and it's not a pattern I see terribly often. Other flavors of this dotfile-controlled versioning can be seen in .nvmrc or .npmrc form.
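For reference, these dotfiles are about as simple as config gets; an .nvmrc (or .node-version) is typically nothing but a version on a single line, something like:

```
4.2.2
```

With nvm installed, running nvm use in that directory switches you to the listed version.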
The definition of an engines property in package.json is ubiquitous, but it's only advisory in current npm versions—you'll get a WARN, but a package made for 4.0+ will happily install on your local version of 0.12 (and then break when you try to use it later).
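For illustration, here's what that advisory approach looks like in a package.json (the name and version range below are hypothetical):

```json
{
  "name": "my-project",
  "version": "1.0.0",
  "engines": {
    "node": ">=4.0.0"
  }
}
```

Install this on 0.12 and npm will warn you, then carry right on installing.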
I'm sure I missed things, but, again, this seems scattershot to me at the moment.
Maybe version management has already been roundly sorted and I'm an ignoramus for not being on top of it. But it's harshing me this week. Although I haven't blogged in a month, a hell of a lot has been going on with my site under the covers. More on that later, maybe, but.
Yesterday I tried to deploy a bunch of site infrastructure updates to lyza.com's new host, netlify. Call me boneheaded for not thinking more carefully about dropping in ES6 enhancements supported by >=4.0.0, but it wasn't until after I'd deployed and the build failed—netlify won't deploy broken builds, fortunately; yay for that—that I realized I didn't even know what version netlify was using to build my site. Turns out it was 0.12.2 (I'll gloss over it, but figuring that out was not as straightforward as you might imagine). Cue a few furious patch commits to back out of my ES6 syntax, and the builds succeeded again.
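For a flavor of the kind of thing that bit me (a hypothetical snippet, not my actual site code): arrow functions and template literals parse happily on node 4.x but are flat-out SyntaxErrors on 0.12:

```javascript
// ES6/ES2015 syntax: fine on node >=4.0.0, a parse-time
// SyntaxError on node 0.12.
const greet = (name) => `Hello, ${name}!`;

console.log(greet("netlify")); // → Hello, netlify!
```

The sneaky part is that a SyntaxError happens at parse time, so even code paths you never execute will blow up the build.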
Netlify support via gitter was quick and responsive, and they explained I could use an .nvmrc keyed to any of the versions available in their open-source Docker build image, but—and I think this is a common theme—it's not documented anywhere and it took a bit of breaking to figure it out.
I think the next few months are going to involve a bit of scrambling as the community works through some of this stuff. Now to go re-apply that ES6 stuff and get my site up to 4.0+. See yah!
I just made some changes to this site. As I do, maybe once per fortnight, and, I swear, I try not to talk about it too much because nerd nerd nerd boring minutiae nerd so boring.
But I got to thinking. My development process for this site involves a party of one—me—and I can't imagine anyone giving a blinking momentary crap about my codebase. It's not solving a general problem. It's useful to exactly one human, whose name starts with L and ends with yza.
So why do I issue pull requests and carefully amend commits to keep my git history tidy? Why do I go to lengths to write READMEs that no one will (need to) read (ever, for any conceivable reason)? Why create a public repository for an unpublished yeoman generator that no one could ever possibly benefit from? Not only that, but I wrote unit tests.
Am I being pedantic? Do I have a glimmer of hope that someone will somehow benefit from my dusty trail of code? Is this what modern technical vanity looks like? Is it like the mandala in the Alvord Desert in 1990, placed there just for the hell of it, maybe with the expectation someone would eventually notice it?
(p.s. That mandala, despite title to the contrary on the above linked, is not in any way unexplained. An artist came forward at the time and explained how he did it. I have a former colleague who covered this story for local media. It's intriguing and on-theme how only the story about it being supernatural or extra-terrestrial survives now in the longer-term internet annals. I guess that's a more interesting take.)
I have just asked many questions. Asking them to myself, the way I assign myself to my own PRs. However dingbat and talking-to-myself this all may seem, the process just feels right. GIT WORKFLOW FOREVER.
The past month has been blurry, due, in part, to some health fun I'll gloss over here. But before I'm off again (it's travel season), I wanted to jot down a few highlights.
For Cloud Four staff, Responsive Field Day was a big ol' deal, one of the bigger things about our 2015 so far. Everything I have to say about the quality of the talks, the community feel of the event, and the contributions of Cloud Four-ians is so laudatory that it doesn't leave much else in my brain. Maybe that's part of being so close to the planning and execution of it. Maybe I can't see the proverbial forest for the trees. But, as well, all of the feedback I've received has been emphatically positive. So, I don't know. It seemed, just...damned good.
It was the first event Cloud Four has put on, and the first event for which I was deeply involved in planning. I enjoyed the hell out of it.
You can find videos and podcast-ed audio of everything from the day, talks and panels, on the event's web site.
Photo copyright 2015 Win Goodbody
My column this month for A List Apart struck a nerve! On the good side, it's kicked up some good conversation about how to get things done to make offline-first a reality, now. For example, David Walsh from Surge explains a tool for managing App Cache more simply.
The Web is complex and fun and I hope we keep making it the best we can.
A habit I've had since I was about ten is poring over maps the way someone else might read novels. I was in Wales a few years ago, whiling away a country evening flipping through a British driving atlas when I started to obsess over how wonderful British village names can be. I'm going to save diving into this until later, because it absolutely deserves its own post, but the esteemed Chris Higgins used an early spreadsheet I made of some of the choice finds to inspire a recent article on Mental Floss.
Here's proof that I can Pinterest with the rest of 'em. I got bored with the scraped-up crappy melamine top of the rolling coffee table in the living room (Ikea, and how).
Sanding, priming, and gold spray paint.
There's a lot of tasty meat to Firt's post about iOS 9 Safari tech deets and, correspondingly, there's been a lot of chatter around the intertubes today. Interesting stuff, from CSS scroll snapping to broader ES6 takeup. Kudos to Max for pulling all of this stuff together—so much info!
But aren't we all just a-quiver about the newly-unveiled "Split View" in Safari:
On newer iPads, you can upgrade Side View to Split View where two apps share the screen working simultaneously.
There are a bunch of geometric flavors of this: 1/3 of the screen, 1/2, 2/3. This has major implications for responsive design!, people shout.
Except it doesn't, in my book.
There's been a voyeuristic flocking to @firt's (quite well-done) dimension porn image:
(Thanks to @firt for permission to use this image from his original post)
And then we fall into a trap: what media queries do I need to target all of those variants? What are the pixel dimensions of each viewport possibility? This changes everything! So much more work now!
Let's reassess. Please.
This situation highlights the import of designing along a continuum, not along rigid particular breakpoints (with, what, windy, desolate, rocky, broken wilderness between them?). Let the content flow like water from a mobile- or content-first baseline design into the spaces of the viewport as it morphs. This sounds a little bit woo-woo, but I swear, in general, it works.
Thinking about proportions and shapes is fine. But when we try to canonize a set of specific pixel-based media queries (ever-expanding as viewport possibilities flourish) we're inadvertently committing ourselves to a kind of arms race that hearkens back to User-Agent sniffing.
I don't mean to understate the complexity of web design in our current device reality—yeah, it's challenging. But it doesn't have to be hard in this way. The number of adjustments I anticipate having to make to my site to make it look good in these new dimensions? Approximately zero. I've already tried to account for the continuum. (Disclaimer: I'm no designer. My version of "looks okay" across viewport sizes may not exactly be world-class.)
If there's any fragmentation here with respect to responsive web design, we're doing it to ourselves. Take a deep breath and set yourself free from the pixel bounds of each viewport out there...isn't that a little better?
The recent results of WebAIM's survey of screen reader users reminded me of two things:
Paraphrased: Web accessibility is a complex subject, but using ARIA landmark roles alone can give us a good start.
WAI-ARIA (Web Accessibility Initiative's Accessible Rich Internet Applications Suite, whew) "provides a framework for adding attributes to identify features for user interaction, how they relate to each other, and their current state." I'm not going to delve into all of what it encompasses because that is a lot.
Let's just talk about roles for now. Using the role attribute on an HTML element is a way of explicitly declaring what that element does in the HTML document. What its purpose is. What it means in this great universe of web stuff. Like this:
<main role="main">
<nav role="navigation"></nav>
</main>
There are a lot of defined roles in ARIA, meant for accomplishing different things. Let's focus in further.
There are four categories of role: abstract, widget, document structure, and landmark.
There are only eight (8) landmark roles to master. Fewer, really, because not all are applicable to all things:

- contentinfo is much like the footer element in applied usage.
- form is close enough to native HTML for <form role="form"> to be redundant.

I am of the opinion that you can get excellent bang for the accessibility-improvement buck by simply using the following four ARIA landmark roles in every HTML document you produce:

- banner: apply to the <header> or other element that wraps the masthead or main banner of the page. Done.
- navigation: apply to your <nav> or whatever element wraps your major navigation.
- main: apply to the element that wraps the primary content of the page. (Confession: the main role was poorly assigned on my site's templates. It'll be fixed by the time you see this.)
- contentinfo: apply to your <footer> or equivalent.

With the exception of navigation, each of these roles should appear only once per document.
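Pulled together, a bare-bones page skeleton carrying those landmark roles might look like this (the element choices here are just one reasonable structure, not gospel):

```html
<body>
  <header role="banner">Site masthead</header>
  <nav role="navigation">Primary navigation</nav>
  <main role="main">The good stuff</main>
  <footer role="contentinfo">Colophon, copyright</footer>
</body>
```

Four attributes, done; that's the whole buy-in.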
Go grab the tota11y toolkit and quickly check any page. It has a section for Landmarks. Easy peasy!
p.s.: This isn't the first I've spoken of this.
It's an awkward time of the year. People with little ones are pushing them back toward school buses. The weather is doing something transitional. The vim that held strong entering the season has faded into a mild flatness by the end of it.
July and August were cerebral for me. I managed to ship my video about life and springtime, I read a dozen or so books, and, ultimately, coded this site from scratch, but I'm left feeling as if I have little of interest to show to anyone outside of my own head.
It's time for something new, something forward. I've been on a hobby moratorium for a few years, out of guilt for my languishing pre-existing ones, but maybe it's time to shrug off that mantle and get excited about something new. Any ideas?
Last weekend, I went to Alaska. I don't know why it took me so long—it's been almost five years since I went to my last new state (Hawaii). In fact, Alaska makes a cool 50—I've been to 'em all now.
Turnagain Arm is a long, dead-end finger of sea extending eastward from Anchorage off of Cook Inlet. We drove around it to get to the Kenai Peninsula.
Cook Inlet—including Turnagain Arm—has a population of Beluga whales. Most of the world's population of Belugas is more arctic, and the Cook Inlet Belugas are genetically isolated. DNA suggests no inter-breeding with other Belugas for something like 6000 years. Unfortunately, about half of the Cook Inlet Belugas died off mysteriously in the 1990s and the herd (pod?) is still struggling to slowly re-establish itself. So I feel doubly lucky that I was able to spot a few, just white humps, out in the water on this trip. Who knows how long they'll be around? (Source for this: various interpretive signs at Beluga Point and Bird Point along the Seward Highway. I have a generally-reliable memory, but this is non-robust, source-wise).
Belugas are one of my favorite whales (perhaps only beaten out by narwhals, which are amazing). They have this lovely demeanor and they look like they'd be good friends. I learned by watching Wikipedia that they are sometimes called Sea Canaries because of their "high-pitched twitter". And also, through associated Wiki-drift, that echolocation and such elaborate chatter are restricted to whales that have teeth (Odontoceti).
We also saw a sea otter in the harbor at Seward. I've been reading James Michener's Alaska, which has a goodly amount to say about sea otters, so I wasn't taken by surprise by how big she or he was. He or she was big! And relaxed, swimming on his/her back and grooming and pausing to let us take photos and video. Did you know that sea otters have the densest fur of any animal? (Both roadside park signs and Wikipedia agree on that so it must be true.)
Late August is spawning season for silver salmon on the Kenai peninsula, and we timidly peered into solid creeks of flipping, disintegrating fish, nervously scanning around for salmon-keen bears.
We were ill-timed (being mid-moon-cycle) and so weren't able to see the legendary bore tide in Turnagain Arm, but I, fascinated with tides in general, console myself by the claim that (from highway signs!) the bore tide hasn't been nearly as amazing since the 1964 Good Friday earthquake dropped the entire arm's seafloor by about six feet. The magnitude-9.2 quake put a number of communities (Portage, Girdwood, Hope) partially or completely under flood. They moved Girdwood up the mountain a couple of miles. They gave up on Portage entirely. Hope starts further inland now, the former water-frontage streets now part of the sea.
Flights and crowds in Alaska in the summer are usually painfully expensive and painfully present, respectively. However, we seem to have found a loophole. Maybe it was fare wars between Alaska Airlines and JetBlue, but round-trip flights in August to Anchorage from Portland could be had for as little as $157. And since we opted to camp, Anchorage's seizure-inducing lodging prices were mostly avoided other than the $275 or so we had to drop on an airport Holiday Inn Express because our flight (cheap, but...) landed at 2:00 AM. The return flight was also oddly-timed, leaving at nearly 1:00 AM, but on the flip side, I got to stare, ecstatic, at 90 minutes of the northern lights through my window, pale green and beautiful.
I recommend visiting this part of Alaska in late August—it's still summer, but the locals are all starting to talk about how winter is coming. The aspens are just starting to go gold (though the alders are still summer-leafed) and the crowds seem to have tapered a little. I'd also like to tout that I didn't get a single bug bite—another benefit of the later season. Camping on the Kenai is pleasant and easy, not bush wilderness but accessible landscape, landscape far bigger than average.
There's this terrible question. I didn't stop to notice how terrible it was until lately, when I was at the receiving end of it at a particularly clenchy moment.
Why didn't you just use [fill in the blank with some specific technology or framework or pattern]?
It's so easy to get to this question. A colleague or friend starts unrolling a story at you: "I spent all day ramming my head against this CORS issue" or "I could not for the life of me figure out how to write a test for this thing" or "I was trying to do this thing with flexbox and...world of pain."
You've already been there; you've already suffered this specific grief. Or, lucky you, you chanced to learn of a solution to the grief in advance of the first time you got whacked with it. You already know a safe path out of the wilderness in question.
In any case, you, at this time of hearing your cohort's misery, forget momentarily how much your own journey through the same wicked forest hurt. Or you underrate how lucky you were to know how to skirt the wicked forest safely in advance of its clutches. Because this is all behind you, you have an empathy glitch and you say:
Why didn't you just use blah with your Access-Control-Allow-Origin headers blah blah blah?
or
Why didn't you just use fake test timers in sinon.js?
or
Why didn't you just use whatever blah polyfill?
I think we mean well. We want to share our battle-worn knowledge. When we say just it also has the connotation of "I assume you already know about this thing we both use in our shared livelihoods...why didn't you just use it?"
First problem: the battle you're critiquing is in the past. Your colleague has already fought it. And come out the other side with some resolution of how she or he subdued the beast in some way. Suggesting an alternate solution is relatively moot.
It also presumes that you have consummate understanding of the whole problem and the surrounding ecosystem. Maybe you don't.
But my frowny issue with this phrase primarily centers around the wretched word just. O! Just. Perpetrator of so much evasion English, stabber of cruel clauses.
Just belittles the original problem, shrinks the monster. It presumes the conflict to be trivial and petty. It casts the person on the listening end as inferior, ignorant or both.
This is a question that's easy to trip into naturally, but should be avoided.
Months and seasons have technical themes, and one of mine right now is about JavaScript bundlers—that is, tools for packaging JavaScript for use in the client (you know: browser) and handling dependencies for us.
I have had some slides in some recent talks that are especially popular, as I mentioned in my recent column on A List Apart about this subject. And when I say popular, I mean they spurred a raucous round of groaning and cheering. I imagine this sound rush marks that moment when a bunch of people in the audience realized: Aha, it's not just hard for me. This is a shared suck.
A number of times recently, I have thought, OK, this is the time. I'm going to sit down and spend a while delving into the problem set of managing dependencies and packaging code, and understand not just the differences between module schemes—which in large part I do, on a syntactic, get-it-done level—but the differences in philosophies and how we got to where we are, and with luck, maybe understand why where we got is so tedious and challenging.
There are parts of this problem I find moderately interesting, but interesting in a momentary-eyebrow-raise way. Interesting in an academic sense. But the majority of it is, to me, boring. It's hard to want to dig in and figure something out that bores the socks off of me. As I say in my slides, one of the greatest technical demotivators is when something is difficult and dull at the same time:
That intersection, for me, is where this stuff ends up:
I'm not bashing the tools that do this work: they're essential and, often, sort of quietly amazing. But it just seems so hard sometimes (maybe I'm a whining, lazy developer?) and there are so many options out there, some of which are pieces of the same puzzle and some of which are mutually incompatible:
When I embarked on creating these talks and the ALA column, I thought my frustration with using these tools lay in their complexity and the dryness of what they accomplish. Getting this stuff right is often complex, and I still feel like the task isn't the most compelling, but I realized what actually gets under my skin.
This stuff makes me feel stupid.
When I spend a half an hour or more fighting a single transpiling error or a missing global, I start clenching and swearing. In my front-brain, I am angry at a weakness in a tool, a poorly-implemented wrapping node module, or the entire benighted ecosystem for its Byzantine smugness. That's why I think I'm mad.
But I now suspect I'm more angry because I feel like a fool. I want to get my head around this stuff, I want to subdue it, prevail. Even if it isn't interesting, the fact that it plays me for the fool puts me on a rampage. What's in my head is Why the hell can't I figure this out? What the hell is wrong with me. I'm in the wrong industry. What comes out of my mouth is: "Goddamit browserify, why do you suck so hard?" Which of course it does not.
So, yeah, it's boring. But that Kryptonite part is relevant. That's what kills me.
I've recognized my ire, so I hope that gives me a path to some sort of healing process. I can sip some herbal tea or something. But concretely, there are a few resources that popped up in the past week that are directly and objectively helpful.
Whether or not I'm correct in calling PostCSS a CSS post-processor, that is, semantic arguments notwithstanding: I love the tool and its general promise and philosophy.
I use PostCSS as part of my workflow on this site. We use it copiously at Cloud Four. One of the hints, to me, that it's a nice fit is that both dev-y and design-y people seem to thrive in its world.
Here are a couple of specific CSS things I've been enjoying writing in my source lately.
var and calc

The combination of calc and var gives you basic computation, which is a large part of the draw of pre-processors like Sass, Less and stylus. I make use of the :root selector as a place to stash my variables, for example:
:root {
--ratio: 1.2;
--font-size-sm: 1em;
--font-size-md: calc(var(--font-size-sm) * var(--ratio));
/* ... and so on, font and heading sizes up to xxl */
}
The above is an abbreviated riff on a pattern established by Cloud Four's Erik Jung, who is a master at typographic thinking. But note how I can chain along defined variables and calcs to create a rhythmic relationship of type. The value of ratio is also used to calculate ideal line-height. It's used in this site's source.
This is valid CSS per the CSS Custom Properties for Cascading Variables Module Level 1 (I call it "CSS Variables" because mouthful). But you won't see it in my site's source untrammeled because PostCSS transpiles it to the kind of CSS browsers of this day and age can handle.
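For the curious, here's roughly what a custom-properties transform emits for a rule consuming the variables above. The .PageTitle selector is hypothetical, and the exact output depends on which PostCSS plugins you run:

```css
/* Source:  .PageTitle { font-size: var(--font-size-md); }
   Output:  var() references replaced with their computed values
   (--font-size-md was calc(var(--font-size-sm) * var(--ratio))). */
.PageTitle {
  font-size: calc(1em * 1.2);
}
```

The browser never sees a var(); it just gets plain, already-resolved CSS.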
Used sparingly, things like this are self-evidently useful:
@custom-selector :--headings h1, h2, h3, h4, h5, h6;
/* ...later on */
:--headings {
color: var(--heading-color);
font: var(--heading-font);
font-weight: var(--font-weight);
line-height: 1;
}
Like with CSS variables, the spec is called something different: CSS Aliases.
And how:
@custom-media --sm-viewport (width >= 30em);
/* ... and later ... */
@media (--sm-viewport) {
.Announcement {
padding: var(--space-xsm);
margin: var(--space-md) var(--space-sm);
}
}
This is per the custom media queries part of the Media Queries Level 4 spec.
Next time I'll natter on about how adding Suit CSS to this mix makes extra magic sauce.
Responsive Day Out 3 in Brighton in June was all sorts of a blast. Lovely weather, nice folks, and the event was crammed with great talks.
The following is nifty ES6/ES2015 goodness:
class Whatever {
constructor({ foo = "bar", baz = "bing" } = {}) {
// Yeah, baby
console.log(foo); // => 'bar' by default
}
}
Hat-tip to @tylersticka on that one!
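To make the defaults concrete, here's my own illustrative variant of the same pattern, stashing the destructured values on the instance. The trailing = {} is what lets a bare new Whatever() work at all:

```javascript
class Whatever {
  constructor({ foo = "bar", baz = "bing" } = {}) {
    // Without the `= {}` default, `new Whatever()` would throw trying
    // to destructure undefined.
    this.foo = foo;
    this.baz = baz;
  }
}

console.log(new Whatever().foo);               // → bar
console.log(new Whatever({ foo: "qux" }).foo); // → qux
```

Options objects with named, defaulted parameters, no boilerplate: yeah, baby, indeed.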
I have a thing for Caslon. Not just because of the old printers' adage ("When in doubt, use Caslon"), but for two other reasons:
When I first started deciding I needed a new website, which was, oh, about three decades ago, the only thing I thought was sorted was that, come hell or tsunami, I was going to use the hell out of Adobe Caslon Pro. Font payload be damned—I was willing to forego other bits and flair to make room in my performance budget. It was a lot of bytes, though, to get all the weights I needed (needed, see?). I was saddled with guilt.
Adobe, however, made this easier for me. They moved most of the useful weights of Adobe Caslon Pro into a higher price tier than I was willing to pay on Typekit.
Then I found Playfair, the heading font on this site, and was irrevocably smitten. I switched horses to Google Fonts. But whatever was I to do with my body text?
I spent a few hours squinting at the combinations of Playfair-Slabo, Playfair-Quattrocento, Playfair-Crimson Text, and so on. Between each test pairing, the site would appear to me without web fonts, and then I realized, holy hell, Times New Roman looks perfectly fine. Good, even. What the hell was I doing?
Enjoy the lighter weight of not having to download another font that isn't any better than what you have available already on your system! I almost let myself get carried away with the fripperies and baubles of pretty custom fonts. I caught myself in time.
]]>My co-workers highlighted this recent video on Digg, part of a series entitled "Atlas Obscura" (thus, presumably denoting its informational contents are, well, obscure), about the salad-bar poisoning by the Rajneeshees in The Dalles, Oregon, in 1984.
Except this information, to me, is anything but obscure. In my memory, the Rajneesh situation flavored several years of Oregon culture and news reporting. My mom still occasionally references it in casual conversation.
It made me curious: How many people know of this event? Is it really so banished to the obscure edges of history? Is it just a sign that I'm getting old, and/or that I am hopelessly from Oregon in that entrenched way? Do I already sound like I'm in my porch rocking chair, creaking on about how In my day, cults poisoned salad bars?
]]>Last month I had the honor of moderating the progressive enhancement panel at Edge Conf in London. Also: sweating and trying to look like I'm really OK despite the failure of the air conditioning at Facebook's beautiful, glass-lined fishtank. I failed, to wit:
I shall give you some brief bullets as to the why:
I'm late on this. There's already been much super coverage from Jeremy and Andreas and Matthew, among many others.
Video coverage is up; here's my session specifically, but you can find all of 'em on the Edge Conf site.
]]>