Dao Day 2024 – a regression in the making | Clagnut by Richard Rutter
The arc of the web is long and bends towards flexibility.
Well, this is just wonderful! Students from Moscow Coding School are translating Resilient Web Design into Russian. Three chapters done so far!
This is literally the reason why I licensed the book with a Creative Commons Attribution-ShareAlike license.
Today marks ten years since the publication of HTML5 For Web Designers, the very first book from A Book Apart.
I’m so proud of that book, and so honoured that I was the first author published by the web’s finest purveyors of brief books. I mean, just look at the calibre of their output since my stumbling start!
Here’s what I wrote ten years ago.
Here’s what Jason wrote ten years ago.
Here’s what Mandy wrote ten years ago.
Here’s what Jeffrey wrote ten years ago.
They started something magnificent. Ten years on, with Katel at the helm, it’s going from strength to strength.
Happy birthday, little book! And happy birthday, A Book Apart! Here’s to another decade!
Join your favorite authors on Zoom where you can have spirited discussions from the privacy of your own quarantined space!
A great initiative from the folks at Mule Design. As well as chatting to talented authors, you can also chat to me: this Thursday at 4pm UTC I’ll be discussing Resilient Web Design.
A look at the trend towards larger and larger font sizes for body copy on the web, culminating with Resilient Web Design.
There are some good arguments here that the upper limit on the font size was too high, so I’ve adjusted it slightly. Now on large screens, the body copy on Resilient Web Design is 32px (2 times 1em), down from 40px (2.5 times 1em).
Funny because it’s true.
I was in Boston last week to give a talk. I ended up giving four.
I was there for An Event Apart which was, as always, excellent. I opened up day two with my talk, The Way Of The Web.
This was my second time giving this talk at An Event Apart—the first time was in Seattle a few months back. It was also my last time giving this talk at An Event Apart—I shan’t be speaking at any of the other AEAs this year, alas. The talk wasn’t recorded either so I’m afraid you kind of had to be there (unless you know of another conference that might like to have me give that talk, in which case, hit me up).
After giving my talk in the morning, I wasn’t quite done. I was on a panel discussion with Rachel about CSS grid. It turned out to be a pretty good format: have one person who’s a complete authority on a topic (Rachel), and another person who’s barely starting out and knows just enough to be dangerous (me). I really enjoyed it, and the questions from the audience prompted some ideas to form in my head that I should really note down in a blog post before they evaporate.
The next day, I went over to MIT to speak at Design 4 Drupal. So, y’know, technically I’ve lectured at MIT now.
I wasn’t going to do the same talk as I gave at An Event Apart, obviously. Instead, I reprised the talk I gave earlier this year at Webstock: Taking Back The Web. I thought it was fitting given how much Drupal’s glorious leader, Dries, has been thinking about, writing about, and building with the indie web.
I really enjoyed giving this talk. The audience were great, and they had lots of good questions afterwards. There’s a video, which is basically my voice dubbed over the slides, followed by a good half hour of questions.
When I was done there, after a brief excursion to the MIT bookstore, I went back across the river from Cambridge to Boston just in time for that evening’s Boston CSS meetup.
Lea had been in touch to ask if I would speak at this meet-up, and I was only too happy to oblige. I tried doing something I’ve never done before: a book reading!
No, not reading from Going Offline, my current book, which I should be encouraging people to buy. Instead I read from Resilient Web Design, the free online book that people literally couldn’t buy if they wanted to. But I figured reading the philosophical ramblings in Resilient Web Design would go over better than trying to do an oral version of the service worker code in Going Offline.
I read from chapters two (Materials), three (Visions), and five (Layers) and I really, really liked doing it! People seemed to enjoy it too—we had questions throughout.
And with that, my time in Boston was at an end. I was up at the crack of dawn the next morning to get the plane back to England where Ampersand awaited. I wasn’t speaking there though. I thoroughly enjoyed being an attendee and absorbing the knowledge bombs from the brilliant speakers that Rich assembled.
The next place I’m speaking will be much closer to home than Boston: I’ll be giving a short talk at Oxford Geek Nights on Wednesday. Come on by if you’re in the neighbourhood.
People of Boston: I’m doing a book reading at your CSS meet-up on Wednesday, June 27th.
(Marketing genius that I am, I won’t be reading from my newest book, which is on sale now, but from the previous book, which is available for free online.)
Tracy’s new book is excellent (and I had the great honour of writing a foreword for it).
Programmers, developers, marketers, and non-designers — want to become a better designer? This short book has everything you need.
The foreword to the self-published short book about design for non-designers.
Whenever I dipped my toe in the waters of the semantic web, I noticed there were two fundamentally different approaches. One approach was driven by the philosophy that absolutely everything in the universe should be theoretically describable. The other approach was far more lax, concentrating only on the popular use-cases: people, places, events, and that was pretty much it. These few common items, so the theory went, accounted for about 80% of actual usage in the real world. Trying to codify the remaining 20% would result in a disproportionate amount of effort.
I always liked that approach. I think it applies to a lot of endeavours. Coding, sketching, cooking—you can get up to speed on the bare essentials pretty quickly, and then spend a lifetime attaining mastery. But we don’t need to achieve mastery at every single thing we do. I’m quite happy to be just good enough at plenty of skills so that I can prioritise the things I really want to spend my time doing.
Perhaps web design isn’t a priority for you. Perhaps you’ve decided to double-down on programming. That doesn’t mean foregoing design completely. You can still design something pretty good …thanks to this book.
Tracy understands the fundamentals of web design so you don’t have to. She spent years learning, absorbing, and designing, and now she has very kindly distilled down the 80% of that knowledge that’s going to be the most useful to you.
Think of Hello Web Design as a book of cheat codes. It’s short, to the point, and tells you everything you need to know to be a perfectly competent web designer.
I’m genuinely touched that my little web book could inspire someone like this. I absolutely love reading about what people thought of the book, especially when they post on their own site like this.
This book has inspired me to approach web site building in a new way. By focusing on the core functionality and expanding it based on available features, I’ll ensure the most accessible site I can. Resilient web sites can give a core experience that’s meaningful, but progressively enhance that experience based on technical capabilities.
A jolly nice review of Resilient Web Design.
Just a few pages in, I could see why so many have read Resilient Web Design all in one go. It lives up to all the excellent reviews.
Chapter 3 of Resilient Web Design, republished in Smashing Magazine:
In the world of web design, we tend to become preoccupied with the here and now. In “Resilient Web Design”, Jeremy Keith emphasizes the importance of learning from the past in order to better prepare ourselves for the future. So, perhaps we should stop and think more beyond our present moment? The following is an excerpt from Jeremy’s web book.
I got a nice email recently from Colin van Eenige. He wrote:
For my graduation project I’m researching the development of Progressive Web Apps and found your offline book called resilient web design. I was very impressed by the implementation of the website and it really was a nice experience.
I’m very interested in your vision on progressive web apps and what capabilities are waiting for us regarding offline content. Would it be fine if I’d send you some questions?
I said that would be fine, although I couldn’t promise a swift response. He sent me four questions. I finally got ‘round to sending my answers…
Well, given the subject matter, it felt right that the canonical version of the book should be not just online, but made with the building blocks of the web. The other formats are all nice to have, but the HTML version feels (to me) like the “real” book.
Interestingly, it wasn’t too much trouble for people to generate other formats from the HTML (ePub, MOBI, PDF), whereas I think trying to go in the other direction would be trickier.
As for the offline part, that felt like a natural fit. I had already done that with a previous book of mine, HTML5 For Web Designers, which I put online a year or two after its print publication. In that case, I used AppCache for the offline functionality. AppCache is horrible, but this use case might be one of the few where it works well: a static book that’s never going to change. Cache invalidation is one of the worst parts of using AppCache so by not having any kinds of updates at all, I dodged that bullet.
But when it came time for Resilient Web Design, a service worker was definitely the right technology. Still, I’ve got AppCache in there as well for the browsers that don’t yet support service workers.
The biggest effect that service workers could have is to change the expectations that people have about using the web, especially on mobile devices. Right now, people associate the web on mobile with long waits and horrible spammy overlays. Service workers can help solve that first part.
If people then start adding sites to their home screen, that will be a great sign that the web is really holding its own. But I don’t think we should get too optimistic about that: for a user, there’s no difference between a prompt on their screen saying “add to home screen” and a prompt on their screen saying “download our app”—they’re equally likely to be dismissed because we’ve trained people to dismiss anything that covers up the content they actually came for.
It’s entirely possible that websites could start taking over much of the functionality that previously was only possible in a native app. But I think that inertia and habit will keep people using native apps for quite some time.
The big exception is in markets where storage space on devices is in short supply. That’s where the decision to install a native app isn’t taken lightly (given the choice between your family photos and an app, most people will reject the app). The web can truly shine here if we build lightweight, performant services.
Even in that situation, I’m still not sure how many people will end up adding those sites to their home screen (it might feel so similar to installing a native app that there may be some residual worry about storage space) but I don’t think that’s too much of a problem: if people get to a site via search or typing, that’s fine.
I worry that the messaging around “progressive web apps” is perhaps over-fetishising the home screen. I don’t think that’s the real battleground. The real battleground is in people’s heads; how they perceive the web and how they perceive native.
After all, if the average number of native apps installed in a month is zero, then that’s not exactly a hard target to match. :-)
For me, progressive web apps don’t feel like a separate thing from making websites. I worry that the marketing of them might inflate expectations or confuse people. I like the idea that they’re simply websites that have taken their vitamins.
So my vision for progressive web apps is the same as my vision for the web: something that people use every day for all sorts of tasks.
I find it really discouraging that progressive web apps are becoming conflated with single page apps and the app shell model. Those architectural decisions have nothing to do with service workers, HTTPS, and manifest files. Yet I keep seeing the concepts used interchangeably. It would be a real shame if people chose not to use these great technologies just because they don’t classify what they’re building as an “app.”
If anything, it’s good ol’ fashioned content sites (newspapers, wikipedia, blogs, and yes, books) that can really benefit from the turbo boost of service worker+HTTPS+manifest.
I was at a conference recently where someone was giving a talk encouraging people to build progressive web apps but discouraging people from doing it for their own personal sites. That’s a horrible, elitist attitude. I worry that this attitude is being codified in the term “progressive web app”.
Well, like I said, I think that some people are focusing a bit too much on the home screen and not enough on the benefits that service workers can provide to just about any website.
My biggest learning is that these technologies aren’t for a specific subset of services, but can benefit just about anything that’s on the web. I mean, just using a service worker to explicitly cache static assets like CSS, JS, and some images is a no-brainer for almost any project.
So there you go—I’m very excited about the capabilities of these technologies, but very worried about how they’re being “sold”. I’m particularly nervous that in the rush to emulate native apps, we end up losing the very thing that makes the web so powerful: URLs.
In which I attempt to answer some questions raised in the reading of Resilient Web Design.
I’ve recorded each chapter of Resilient Web Design as MP3 files that I’ve been releasing once a week. The final chapter is recorded and released so my audio work is done here.
If you want to subscribe to the podcast, pop this RSS feed into your podcast software of choice. Or use one of these links:
Or you can have it as one single MP3 file to listen to as an audio book. It’s two hours long.
So, for those keeping count, the book is now available as HTML, PDF, EPUB, MOBI, and MP3.
David picks up on one of the closing themes of Resilient Web Design—how we choose our tools. This has been on my mind a lot; it’s what I’ll be talking about at conferences this year.
That’s part of my job to ease processes and reduce frictions. That’s part of my job to take into account from the early beginning of a product its lasting qualities.
There’s a very good point here about when and how we decide to remove the things we’ve added to our projects:
We spend our time adding features without considering at the same pace the removal of useless ones. And still the true resilience (or is it perfection Antoine?) is when there is nothing more to take away. What are you removing on Monday to make our Web more resilient?
I’ve written before about taking an online book offline, documenting the process behind the web version of HTML5 For Web Designers. A book is quite a static thing so it’s safe to take a fairly aggressive offline-first approach. In fact, a static unchanging book is one of the few situations that AppCache works for. Of course a service worker is better, but until AppCache is removed from browsers (and until service worker is supported across the board), I’m using both. I wouldn’t recommend that for most sites though—for most sites, use a service worker to enhance it, and avoid AppCache like the plague.
For Resilient Web Design, I took a similar approach to HTML5 For Web Designers but I knew that there was a good chance that some of the content would be getting tweaked at least for a while. So while the approach is still cache-first, I decided to keep the cache fairly fresh.
Here’s my service worker. It starts with the usual stuff: when the service worker is installed, there’s a list of static assets to cache. In this case, that list is literally everything; all the HTML, CSS, JavaScript, and images for the whole site. Again, this is a pattern that works well for a book, but wouldn’t be right for other kinds of websites.
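That install step can be sketched like so. This is a minimal sketch rather than the book’s actual code: the cache name and asset list here are illustrative stand-ins (the real list enumerates every page of the book).

```javascript
// Sketch of the install step: pre-cache a list of static assets.
// The cache name and asset list are illustrative, not the book's real ones.
const staticCacheName = 'static-v1';
const staticAssets = [
    '/',
    '/css/styles.css',
    '/js/scripts.js'
];

// Open the cache and add every listed asset to it
function precache() {
    return caches.open(staticCacheName)
    .then( cache => cache.addAll(staticAssets) );
}

// Only register the handler when running inside a service worker
if (typeof self !== 'undefined' && self.addEventListener) {
    self.addEventListener('install', event => {
        event.waitUntil(precache());
    });
}
```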
The real heavy lifting happens with the fetch event. This is where the logic sits for what the service worker should do every time there’s a request for a resource. I’ve documented the logic with comments:
// Look in the cache first, fall back to the network
// CACHE
// Did we find the file in the cache?
// If so, fetch a fresh copy from the network in the background
// NETWORK
// Stash the fresh copy in the cache
// NETWORK
// If the file wasn't in the cache, make a network request
// Stash a fresh copy in the cache in the background
// OFFLINE
// If the request is for an image, show an offline placeholder
// If the request is for a page, show an offline message
So my order of preference is:

1. the cache,
2. the network,
3. an offline fallback.
Leaving aside that third part, regardless of whether the response is served straight from the cache or from the network, the cache gets a top-up. If the response is being served from the cache, there’s an additional network request made to get a fresh copy of the resource that was just served. This means that the user might be seeing a slightly stale version of a file, but they’ll get the fresher version next time round.
Again, I think this is acceptable for a book where the tweaks and changes should be fairly minor, but I definitely wouldn’t want to do it on a more dynamic site where the freshness matters more.
Here’s what it usually looks like when a file is served up from the cache:
caches.match(request)
.then( responseFromCache => {
    // Did we find the file in the cache?
    if (responseFromCache) {
        return responseFromCache;
    }
});
I’ve introduced an extra step where the fresher version is fetched from the network. This is where the code can look a bit confusing: the network request is happening in the background after the cached file has already been returned, but the code appears before the return statement:
caches.match(request)
.then( responseFromCache => {
    // Did we find the file in the cache?
    if (responseFromCache) {
        // If so, fetch a fresh copy from the network in the background
        event.waitUntil(
            // NETWORK
            fetch(request)
            .then( responseFromFetch => {
                // Stash the fresh copy in the cache
                caches.open(staticCacheName)
                .then( cache => {
                    cache.put(request, responseFromFetch);
                });
            })
        );
        return responseFromCache;
    }
});
It’s asynchronous, see? So even though all that network code appears before the return statement, it’s pretty much guaranteed to complete after the cache response has been returned. You can verify this by putting in some console.log statements:
caches.match(request)
.then( responseFromCache => {
    if (responseFromCache) {
        event.waitUntil(
            fetch(request)
            .then( responseFromFetch => {
                console.log('Got a response from the network.');
                caches.open(staticCacheName)
                .then( cache => {
                    cache.put(request, responseFromFetch);
                });
            })
        );
        console.log('Got a response from the cache.');
        return responseFromCache;
    }
});
Those log statements will appear in this order:
Got a response from the cache.
Got a response from the network.
That’s the opposite order in which they appear in the code. Everything inside the event.waitUntil part is asynchronous.
Here’s the catch: this kind of asynchronous waitUntil hasn’t landed in all the browsers yet. The code I’ve written will fail.
But never fear! Jake has written a polyfill. All I need to do is include that at the start of my serviceworker.js file and I’m good to go:
// Import Jake's polyfill for async waitUntil
importScripts('/js/async-waituntil.js');
I’m also using it when a file isn’t found in the cache, and is returned from the network instead. Here’s what the usual network code looks like:
fetch(request)
.then( responseFromFetch => {
    return responseFromFetch;
});
I want to also store that response in the cache, but I want to do it asynchronously—I don’t care how long it takes to put the file in the cache as long as the user gets the response straight away.
Technically, I’m not putting the response in the cache; I’m putting a copy of the response in the cache (it’s a stream, so I need to clone it if I want to do more than one thing with it).
fetch(request)
.then( responseFromFetch => {
    // Stash a fresh copy in the cache in the background
    let responseCopy = responseFromFetch.clone();
    event.waitUntil(
        caches.open(staticCacheName)
        .then( cache => {
            cache.put(request, responseCopy);
        })
    );
    return responseFromFetch;
});
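Putting those pieces together, the whole cache-first logic can be sketched as a single fetch handler. This is a sketch under assumptions rather than the actual serviceworker.js: the cache name and the /offline.html fallback path are hypothetical, and the image-versus-page offline distinction from the comment outline is collapsed into a single fallback here.

```javascript
// A sketch of the complete fetch logic, assembled from the pieces above.
// The cache name and '/offline.html' fallback are illustrative stand-ins.
const staticCacheName = 'static-v1';

function respondCacheFirst(event) {
    const request = event.request;
    // CACHE: look in the cache first
    return caches.match(request)
    .then( responseFromCache => {
        if (responseFromCache) {
            // Fetch a fresh copy from the network in the background...
            event.waitUntil(
                // NETWORK
                fetch(request)
                .then( responseFromFetch => {
                    // ...and stash the fresh copy in the cache
                    return caches.open(staticCacheName)
                    .then( cache => cache.put(request, responseFromFetch) );
                })
            );
            // ...while the cached copy is returned straight away
            return responseFromCache;
        }
        // NETWORK: the file wasn't in the cache, so make a network request
        return fetch(request)
        .then( responseFromFetch => {
            // Clone the response: a stream can only be read once
            const responseCopy = responseFromFetch.clone();
            // Stash the copy in the cache in the background
            event.waitUntil(
                caches.open(staticCacheName)
                .then( cache => cache.put(request, responseCopy) )
            );
            return responseFromFetch;
        });
    })
    .catch( () => {
        // OFFLINE: both cache and network failed
        return caches.match('/offline.html');
    });
}

// Only register the handler when running inside a service worker
if (typeof self !== 'undefined' && self.addEventListener) {
    self.addEventListener('fetch', event => {
        event.respondWith(respondCacheFirst(event));
    });
}
```

With this in place, respondWith answers every request from the cache when it can, while waitUntil keeps the cache topped up in the background.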
That all seems to be working well in browsers that support service workers. For legacy browsers, like Mobile Safari, there’s the much blunter caveman logic of an AppCache manifest.
Here’s the JavaScript that decides whether a browser gets the service worker or the AppCache:
if ('serviceWorker' in navigator) {
    // If service workers are supported
    navigator.serviceWorker.register('/serviceworker.js');
} else if ('applicationCache' in window) {
    // Otherwise inject an iframe to use appcache
    var iframe = document.createElement('iframe');
    iframe.setAttribute('src', '/appcache.html');
    iframe.setAttribute('style', 'width: 0; height: 0; border: 0');
    document.querySelector('footer').appendChild(iframe);
}
Either way, people are making full use of the offline nature of the book and that makes me very happy indeed.
And I've been reading this book on the subway, offline. Because Service Workers. \o/
— Jen Simmons (@jensimmons) December 13, 2016
Read @adactio’s latest book for free on any device you own. It’s installable as a #PWA too (naturally). https://resilientwebdesign.com/
— Aaron Gustafson (@AaronGustafson) December 14, 2016
Upside of winter weather: Getting time to read @adactio’s beautiful new book “Resilient Web Design” on commute home https://resilientwebdesign.com/
— Craig Saila (@saila) December 14, 2016
Reached the end of a chapter while underground, saw the link to next chapter and thought “wait, what if…?”
— Sébastien Cevey (@theefer) December 15, 2016
*clicked*
Got the next page. 👏
@adactio I've been reading it in the wifi-deadzone of our kitchen in the sunshine - offline first ftw.
— Al Power (@alpower) December 17, 2016
Thank you @adactio for making the resilient web design book and offline experience. It's drastically improving my ill-prepared train journey
— Glynn Phillips (@GlynnPhillips) December 20, 2016
We moved, and I'll be sans-internet for weeks. Luckily, @adactio's book is magic, and the whole thing was cached! https://resilientwebdesign.com/
— Evan Payne (@evanfuture) December 26, 2016
App success: https://resilientwebdesign.com working offline out of the box, which let me read a few chapters on the train with no internet.
— Florens Verschelde (@fvsch) December 28, 2016
@DNAtkinson that’s a good read. Did the whole thing on my 28 bus journey from Brighton to Lewes. Glad @adactio made it #offlinefirst :)
— Matt (@bupk_es) January 10, 2017
@bupk_es @adactio Yes! I was perched in @Ground_Coffee Lewes—no wifi—thought I’d only be able to read 1 chapter: v gratified I cd read it all.
— David N Atkinson (@DNAtkinson) January 10, 2017
Well, this is nice! Susan has listed the passages she highlighted from Resilient Web Design.
In the spirit of the book, I read it in a browser, and I broke up my highlights by chapters. As usual, you should read the book yourself; these highlights are taken out of context and better when you’ve read the whole thing.
I’m really touched—and honoured—that my book could have this effect.
It made me fall back in love with the web and with making things for the web.