Making Resilient Web Design work offline
I’ve written before about taking an online book offline, documenting the process behind the web version of HTML5 For Web Designers. A book is quite a static thing so it’s safe to take a fairly aggressive offline-first approach. In fact, a static unchanging book is one of the few situations that AppCache works for. Of course a service worker is better, but until AppCache is removed from browsers (and until service worker is supported across the board), I’m using both. I wouldn’t recommend that for most sites though—for most sites, use a service worker to enhance it, and avoid AppCache like the plague.
For Resilient Web Design, I took a similar approach to HTML5 For Web Designers but I knew that there was a good chance that some of the content would be getting tweaked at least for a while. So while the approach is still cache-first, I decided to keep the cache fairly fresh.
Here’s my service worker. It starts with the usual stuff: when the service worker is installed, there’s a list of static assets to cache. In this case, that list is literally everything; all the HTML, CSS, JavaScript, and images for the whole site. Again, this is a pattern that works well for a book, but wouldn’t be right for other kinds of websites.
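The install code itself isn’t reproduced here, but a minimal sketch of that pattern, with a placeholder cache name and file list rather than the book’s actual assets, would look something like this:

// A minimal sketch of the install step; the cache name and the
// file list here are placeholders, not the book's actual assets
const staticCacheName = 'static-v1';
const staticAssets = [
    '/',
    '/css/styles.css',
    '/js/scripts.js',
    '/images/cover.png'
    // ...and so on, for every page and image in the book
];

addEventListener('install', installEvent => {
    installEvent.waitUntil(
        caches.open(staticCacheName)
        .then( cache => cache.addAll(staticAssets) )
    );
});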
The real heavy lifting happens with the fetch event. This is where the logic sits for what the service worker should do every time there’s a request for a resource. I’ve documented the logic with comments:
// Look in the cache first, fall back to the network

    // CACHE
    // Did we find the file in the cache?
    // If so, fetch a fresh copy from the network in the background
        // NETWORK
        // Stash the fresh copy in the cache

    // NETWORK
    // If the file wasn't in the cache, make a network request
    // Stash a fresh copy in the cache in the background

    // OFFLINE
    // If the request is for an image, show an offline placeholder
    // If the request is for a page, show an offline message
So my order of preference is:
- Try the cache first,
- Try the network second,
- Fall back to a placeholder as a last resort (sketched below).
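That last part isn’t shown in the snippets that follow, but a rough sketch of it, using hypothetical paths for the placeholder image and offline page, might look like this:

// A rough sketch of the offline fallback; the placeholder image and
// offline page paths are hypothetical, not the book's actual files
addEventListener('fetch', fetchEvent => {
    const request = fetchEvent.request;
    fetchEvent.respondWith(
        caches.match(request)
        .then( responseFromCache => responseFromCache || fetch(request) )
        .catch( () => {
            // OFFLINE
            // If the request is for an image, show an offline placeholder
            if ((request.headers.get('Accept') || '').includes('image')) {
                return caches.match('/images/placeholder.png');
            }
            // If the request is for a page, show an offline message
            return caches.match('/offline.html');
        })
    );
});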
Leaving aside that third part, regardless of whether the response is served straight from the cache or from the network, the cache gets a top-up. If the response is being served from the cache, there’s an additional network request made to get a fresh copy of the resource that was just served. This means that the user might be seeing a slightly stale version of a file, but they’ll get the fresher version next time round.
Again, I think this is acceptable for a book, where the tweaks and changes should be fairly minor, but I definitely wouldn’t want to do it on a more dynamic site where freshness matters more.
Here’s what it usually looks like when a file is served up from the cache:
caches.match(request)
    .then( responseFromCache => {
        // Did we find the file in the cache?
        if (responseFromCache) {
            return responseFromCache;
        }
I’ve introduced an extra step where the fresher version is fetched from the network. This is where the code can look a bit confusing: the network request is happening in the background after the cached file has already been returned, but the code appears before the return statement:
caches.match(request)
    .then( responseFromCache => {
        // Did we find the file in the cache?
        if (responseFromCache) {
            // If so, fetch a fresh copy from the network in the background
            event.waitUntil(
                // NETWORK
                fetch(request)
                .then( responseFromFetch => {
                    // Stash the fresh copy in the cache
                    caches.open(staticCacheName)
                    .then( cache => {
                        cache.put(request, responseFromFetch);
                    });
                })
            );
            return responseFromCache;
        }
It’s asynchronous, see? So even though all that network code appears before the return statement, it’s pretty much guaranteed to complete after the cache response has been returned. You can verify this by putting in some console.log statements:
caches.match(request)
    .then( responseFromCache => {
        if (responseFromCache) {
            event.waitUntil(
                fetch(request)
                .then( responseFromFetch => {
                    console.log('Got a response from the network.');
                    caches.open(staticCacheName)
                    .then( cache => {
                        cache.put(request, responseFromFetch);
                    });
                })
            );
            console.log('Got a response from the cache.');
            return responseFromCache;
        }
Those log statements will appear in this order:
Got a response from the cache.
Got a response from the network.
That’s the opposite of the order in which they appear in the code. Everything inside the event.waitUntil part is asynchronous.
Here’s the catch: this kind of asynchronous waitUntil hasn’t landed in all the browsers yet. The code I’ve written will fail.
But never fear! Jake has written a polyfill. All I need to do is include that at the start of my serviceworker.js file and I’m good to go:
// Import Jake's polyfill for async waitUntil
importScripts('/js/async-waituntil.js');
I’m also using it when a file isn’t found in the cache, and is returned from the network instead. Here’s what the usual network code looks like:
fetch(request)
    .then( responseFromFetch => {
        return responseFromFetch;
    })
I want to also store that response in the cache, but I want to do it asynchronously—I don’t care how long it takes to put the file in the cache as long as the user gets the response straight away.
Technically, I’m not putting the response in the cache; I’m putting a copy of the response in the cache (it’s a stream, so I need to clone it if I want to do more than one thing with it).
fetch(request)
    .then( responseFromFetch => {
        // Stash a fresh copy in the cache in the background
        let responseCopy = responseFromFetch.clone();
        event.waitUntil(
            caches.open(staticCacheName)
            .then( cache => {
                cache.put(request, responseCopy);
            })
        );
        return responseFromFetch;
    })
That all seems to be working well in browsers that support service workers. For legacy browsers, like Mobile Safari, there’s the much blunter caveman logic of an AppCache manifest.
Here’s the JavaScript that decides whether a browser gets the service worker or the AppCache:
if ('serviceWorker' in navigator) {
    // If service workers are supported
    navigator.serviceWorker.register('/serviceworker.js');
} else if ('applicationCache' in window) {
    // Otherwise inject an iframe to use appcache
    var iframe = document.createElement('iframe');
    iframe.setAttribute('src', '/appcache.html');
    iframe.setAttribute('style', 'width: 0; height: 0; border: 0');
    document.querySelector('footer').appendChild(iframe);
}
Either way, people are making full use of the offline nature of the book and that makes me very happy indeed.
And I've been reading this book on the subway, offline. Because Service Workers. \o/
— Jen Simmons (@jensimmons) December 13, 2016
Read @adactio’s latest book for free on any device you own. It’s installable as a #PWA too (naturally). https://resilientwebdesign.com/
— Aaron Gustafson (@AaronGustafson) December 14, 2016
Upside of winter weather: Getting time to read @adactio’s beautiful new book “Resilient Web Design” on commute home https://resilientwebdesign.com/
— Craig Saila (@saila) December 14, 2016
Reached the end of a chapter while underground, saw the link to next chapter and thought “wait, what if…?”
*clicked*
Got the next page. 👏
— Sébastien Cevey (@theefer) December 15, 2016
@adactio I've been reading it in the wifi-deadzone of our kitchen in the sunshine - offline first ftw.
— Al Power (@alpower) December 17, 2016
Thank you @adactio for making the resilient web design book and offline experience. It's drastically improving my ill-prepared train journey
— Glynn Phillips (@GlynnPhillips) December 20, 2016
We moved, and I'll be sans-internet for weeks. Luckily, @adactio's book is magic, and the whole thing was cached! https://resilientwebdesign.com/
— Evan Payne (@evanfuture) December 26, 2016
App success:
- https://resilientwebdesign.com working offline out of the box, which let me read a few chapters on the train with no internet.
— Florens Verschelde (@fvsch) December 28, 2016
@DNAtkinson that’s a good read. Did the whole thing on my 28 bus journey from Brighton to Lewes. Glad @adactio made it #offlinefirst :)
— Matt (@bupk_es) January 10, 2017
@bupk_es @adactio Yes! I was perched in @Ground_Coffee Lewes—no wifi—thought I’d only be able to read 1 chapter: v gratified I cd read it all.
— David N Atkinson (@DNAtkinson) January 10, 2017