There is a huge and ever-widening gap between the devices we use to make the web and the devices most people use to consume it. It’s also no secret that the average size of a website is huge, and it’s only going to get larger.
What can you do about this? Get your hands on a craptop and try to use your website or web app.
Craptops are cheap devices with lower-power internals. They often come with all sorts of third-party apps preinstalled as a way to offset their cost, such as resource-intensive virus scanners that are difficult to remove. They’re everywhere, and they’re not going away anytime soon.
As you work your way through your website or web app, take note of:
- what loads slowly,
- what loads so slowly that it’s unusable, and
- what doesn’t even bother to load at all.
After that, formulate a plan about what to do about it.
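One lightweight way to keep those notes consistent is to record each page’s load time and sort it into the three buckets above. Here is a minimal sketch; the thresholds are illustrative assumptions, not standards, so tune them to whatever actually feels unusable on your test device.

```python
# Sort observed page load times into the three buckets above.
# These thresholds are assumptions for illustration; adjust them
# to match what feels unusable on your craptop.

SLOW_SECONDS = 5        # noticeably slow, but still usable
UNUSABLE_SECONDS = 15   # loads, but so slowly it's effectively broken

def triage(load_seconds):
    """Classify one page load. None means the page never loaded."""
    if load_seconds is None:
        return "failed to load"
    if load_seconds >= UNUSABLE_SECONDS:
        return "unusably slow"
    if load_seconds >= SLOW_SECONDS:
        return "slow"
    return "ok"

# Example: load times recorded while clicking through a site.
observations = {"/": 3.2, "/products": 8.9, "/dashboard": 22.0, "/chat": None}
report = {path: triage(t) for path, t in observations.items()}
```

Even a crude spreadsheet built this way gives you something concrete to bring into the planning conversation, rather than a vague sense that “the site feels slow.”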
The industry average
At the time of this post, the most common devices used to read CSS-Tricks are powerful, modern desktops, laptops, tablets, and phones with up-to-date operating systems and plenty of computational power.
Granted, not everyone who makes websites and web apps reads CSS-Tricks, but it is a very popular industry website, and I’m willing to bet its visitors are indicative of the greater whole.
In terms of performance, the qualities we can note from these devices are:
- powerful processors,
- generous amounts of RAM,
- lots of storage space,
- high-quality displays, and most likely a
- high-speed internet connection.
Unfortunately, these qualities are not always found in the devices people use to access your content.
Survivor bias
British soldiers in World War I were equipped with a Brodie helmet, a steel hat designed to protect its wearer from overhead blasts and shrapnel while conducting trench warfare. After its deployment, field hospitals saw an uptick in soldiers with severe head injuries.
Because of the rise in injuries, British command considered going back to the drawing board with the helmet’s design. Fortunately, a statistician pointed out that the dramatic rise in hospital cases was because people were surviving injuries that previously would have killed them. Before the introduction of steel, the British Army used felt or leather as headwear material.
Survivor bias is the logical error that focuses on those who made it past a selection process. In the case of the helmet, it’s whether you’re alive or not. In the case of websites and web apps, it’s if a person can load and use your content.
Lies, damned lies, and statistics
People who can’t load your website or web app don’t show up as visitors in your analytics suite. This is straightforward enough.
However, the “use” part of “load and use your content” is the important bit here. A certain percentage of devices that try to access your product will be able to load enough of it to register a hit, but then bounce because the experience is so terrible it is effectively unusable.
Yes, I know analytics can be more sophisticated than this. But viewed through the lens of survivor bias, is this behavior something your data is actually accounting for?
Blame
It’s easy to go out and get a cheap craptop and feel bad about a slow website you have no control over. The two real problems here are:
- Third-party assets, such as the very analytics and CRM packages you use to determine who is using your product and how they go about it. There’s no real control over the quality or amount of code they add to your site, and setting up the logic to block them loading their own third-party resources is difficult to do.
- The people who tell you to add these third-party assets. These people typically aren’t aware of the performance issues caused by the ask, or don’t care because it’s not part of the results they’re judged by.
What can we do about these two issues? Tie abstract, one-off business requests into something more holistic and personal.
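One way to make that conversation concrete is to quantify how much of your page weight comes from third parties. The sketch below tallies transferred bytes by origin from a HAR file (the network export you can save from browser DevTools). The hostnames in the example are hypothetical.

```python
from urllib.parse import urlparse

def third_party_bytes(har, first_party_hosts):
    """Sum transferred bytes in a HAR, split into first- vs. third-party.

    `first_party_hosts` is the set of hostnames you consider your own;
    everything else counts as third-party.
    """
    totals = {"first-party": 0, "third-party": 0}
    for entry in har["log"]["entries"]:
        host = urlparse(entry["request"]["url"]).hostname
        # _transferSize can be -1 for cached responses, so clamp to 0.
        size = entry["response"].get("_transferSize", 0)
        bucket = "first-party" if host in first_party_hosts else "third-party"
        totals[bucket] += max(size, 0)
    return totals

# Tiny inline HAR-shaped example (hostnames are hypothetical):
har = {"log": {"entries": [
    {"request": {"url": "https://example.com/app.js"},
     "response": {"_transferSize": 120_000}},
    {"request": {"url": "https://cdn.analytics-vendor.example/tag.js"},
     "response": {"_transferSize": 300_000}},
]}}
totals = third_party_bytes(har, {"example.com"})
```

With a real export you would pass in `json.load(open("site.har"))` instead. A number like “70% of our page weight is vendor code we don’t control” lands very differently in a meeting than a general complaint about analytics tags.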
Bear witness
I know of organizations that do things like “Testing Tuesdays,” where moderated usability testing is conducted every Tuesday. You could do the same for performance, or even thread this idea into existing usability testing plans—slow websites aren’t usable, after all.
The point is to establish a regular cadence of seeing how real people actually use your website or web app on real-world devices. And when I say real-world, make sure it’s not just the average version of whatever your analytics reports say.
Then make sure everyone is aware of these sessions. It’s a powerful thing to show a manager someone trying to get what they need, but can’t because of the choices your organization has made.
Craptop duty
There are roughly 260 work days in a year. That’s 260 chances to build some empathy by having someone on your development, design, marketing, or leadership team use the craptop for a day.
You can use the Windows Subsystem for Linux (WSL) to run most development tooling. Most other apps I’m aware of in the web-making space have a Windows installer, or can run from a browser. That should be enough to do what you need to do. And if you can’t, or it’s too slow to work at the pace you’re accustomed to, well, that’s sort of the point.
Craptop duty, combined with usability testing with a low power device, should hopefully be enough to have those difficult conversations about what your website or web app really needs to load and why.
Don’t tokenize
The final thing I’d like to say is that it’s easy to think that the presence of a lower power device equals the presence of an economically disadvantaged person. That’s not true. Powerful devices can become circumstantially slowed by multiple factors. Wealthy individuals can, and do, use lower-power technology.
Perhaps the most important takeaway is poor people don’t deserve an inferior experience, regardless of what they are trying to do. Performant, intuitive, accessible experiences on the web are for everyone, regardless of device, ability, or circumstance.
While working on site performance, we found that lab data clocks in around 35, but CrUX data passes for the origin as well as the field data.
We cross-verified with our analytics report and found that:
90% of traffic is from mobile, and of that,
60% is from Safari,
30% is from Chrome, and
10% is shared by others.
All users’ load times are under 2.5 seconds.
Optimization recommendations include image resolutions, reducing animation, and the JS features we’ve integrated.
When real-world users experience the site as fast, with good LCP, FCP, and Speed Index, is it necessary to take up the optimization work?
Thanks, great advice Eric!
I actually get frustrated by the other experience. I have a powerful laptop but am sometimes trapped behind a 10 Mbps ADSL internet connection. I am constantly amazed by websites that load 30 or more files to display the page. They usually take more than a minute to load. To go with the crappy laptop, you should add a 10 Mbps hub.
Indeed, there is a good chance they are testing on a gigabit connection these days.
I have laptops varying from Chromebooks, seven-year-old Lenovos, a two-year-old XPS 13, and a Lenovo Xeon workstation laptop with 64 GB of RAM and dual 1 TB NVMes. I don’t notice any difference in web apps loading on any of them. Even the oldest machines are dual-core 2 GHz Celerons or 8-core ARMs.
That’s an old Eddie Van Halen/Don Landee recording studio trick.
When they were in the studio recording Van Halen songs and got to a point they thought it sounded good, Eddie would make a cassette tape of it. Out in the parking lot, they kept a shitty old Chevette with an old tape deck and broken speakers in it.
Eddie would take the tape and drive around the block listening to it in this setup. His rationale was that “this is what our target audience will be listening to our music on – not $1000 a piece studio monitor speakers”. If they can make it sound good in the Crapvette, their customers would all be happy. Not just the “nice stereo” crowd.
Apparently it worked. Great article by the way.
I love this anecdote, thank you for sharing it! I’ll definitely be listening to them today.
I am using a 2013 Mac Pro and a 2017 iPad. Typically the iPad downloads a website, and then the ads arrive over the next few minutes (using a Starlink satellite dish, i.e., typically 122 Mbps download).
Each time a new ad loads the composition updates and the screen jumps up or down or both, until the next ad loads, and it starts again. In short it is the slow loading ad servers that are wrecking the UI.
That is just the net. I am amazed that it is so easy to lose data. If I am typing and I accidentally touch the screen, a whole page of data will disappear. I even have to wait seconds at a time for words to appear on the screen, so last millennium!
The priority is clearly on harvesting the data, not on saving it automatically. In the 21st century, losing the client’s data is unforgivable.
It has been uploading to the cloud but has not been stored to local memory even though there is loads of it available. I could do this stuff on a Palm Pilot decades ago.
My impression is that Apple is expecting the rest of the world to have network speeds equal to Cupertino’s.
The iPad freezes while waiting for the cloud so often that it is necessary to shut down and restart at least once a day. It reminds me of Windows 95.
Pardon me if my sense of entitlement is showing, but isn’t this elementary stuff?
While working on NeWS (at least), James Gosling had a practice of using Sun’s lowest-powered workstation for his development work.
What an incredibly fresh view point. Testing products on low performance devices to broaden accessibility definition.