I’ve seen more 500 server errors than failures from all of these reasons combined. I suggest arguing against JavaScript on the principle of least power, not because lightning might strike when you walk out your front door.
Half of these are errors so fundamental that you could commit them with any technology…
Of course I think graceful degradation is nice. But if there were an actual business advantage to having no JavaScript at all, you’d see a lot more sites like that. The world’s most used sites ship plenty of JavaScript.

I’m not a fan of JavaScript, but none of these reasons has ever happened to me.

Web site owners don’t care. “Enable JavaScript or GTFO”, except not in so many words.

Exactly, I can already tell you what support will say when you call them because their web app is broken:

“Please clear your cookies and try again. If it still doesn’t work, please install the latest version of Google Chrome.”
but 15 items shows you how likely it is JavaScript will not be available, or available in a limited fashion
No it doesn’t. Across a few million daily page loads, less than 0.05% of my traffic arrives without JavaScript, and it’s usually curl or LWP or some other scraper doing something silly. Your traffic might be different, so it’s important to measure it, then decide what you want to do about it. For me, at such a small number, the juice probably isn’t worth the squeeze, but I have other issues with this list:
A browser extension has interfered with the site
Possible, but so what? Ad blockers are designed to block ads; offer scripts help users find better prices than yours. In my experience this almost never goes wrong, because breakage makes the user more likely to uninstall the extension.

A spotty connection hasn’t loaded the dependencies correctly
Don’t depend on other people’s network. If my server is up, it can serve all of its assets. “Spotty” connections don’t work any other way.

Internal IT policy has blocked dependencies
Again: don’t depend on other people’s network.

WIFI network has blocked certain CDNs
Again, don’t depend on other people’s network. Bandwidth is so freaking cheap you should just serve it yourself. Don’t let some third party harvest your visitors’ data to save a few pennies; they might block your images or your CSS as well, and your brand could look like shit for pennies.

A user is viewing your site on a train which has just gone into a tunnel
Possible, but so what? The user knows they are in a tunnel and will hit the refresh button on the other side.

A device doesn’t have enough memory available
Possible, but I don’t know anyone with a mobile phone older than 10 years, and that’s still iPhone 6-era performance, maybe HTC One? Test with an old Android and see. This isn’t affecting me; I don’t even see it in the page requests.

There’s an error in your JavaScript
You must be joking. Have you met me?

An async fetch request wasn’t fenced off in a try catch and has failed
Ha. (A sketch of the fenced version follows this comment.)

A user has a JavaScript toggle accidentally turned off
I don’t believe this. Turn it off and ask your mom to try ten of her favourite websites. If she doesn’t ask what’s wrong with your computer, she’s not going to buy anything I sell.

A user uses a JavaScript toggle to prevent ads loading
Possible, but so what? That’s what they are designed to do.

An ad blocker has blocked your JavaScript from loading
Possible but unlikely. I test with a few different ad blockers, and most of them are pretty good about not blocking things that aren’t ads.

A user is using Opera Mini
Possible. I see something like 5^-6% of my page loads from some kind of Opera, so maybe when I start making a million dollars a day, fixing Opera will be worth a dollar a day. But this is not my reality, and heck, even at Google’s revenue this can’t be worth more than a grand a day to them.

A user has data saving turned on
Possible, but so what? I do this too and it seems fine. Try it.

Rogue, interfering scripts have been added by Google Tag Manager
Don’t depend on other people’s network. Google Tag Manager is trash; I don’t use it and I don’t recommend anyone use it. I don’t sympathise with anyone having any variant of this problem.

The browser has locked up trying to parse your JS bundle
Possible, which is why I like to compare JS loads against telemetry that the JS sends me.
99.67% of my JS page loads include the telemetry response. I don’t believe spending any time on a JS-free experience is worth anything to the business, but I appreciate that it could be to someone, so I would like to see more things to check and try, not more things that could go wrong (but won’t or don’t).
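On the unfenced fetch item above: a minimal sketch of what the fenced version looks like, with an invented endpoint and fallback:

```js
// Minimal sketch of a fetch “fenced off” in try/catch. The endpoint
// and the fallback behaviour are invented for illustration.
async function loadPrices() {
  try {
    const res = await fetch("/api/prices");
    if (!res.ok) throw new Error("HTTP " + res.status);
    return await res.json();
  } catch (err) {
    // A dropped connection or bad response degrades gracefully instead
    // of surfacing as an uncaught promise rejection.
    console.warn("price data unavailable, using fallback", err);
    return null;
  }
}
```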
I’m not sure how to put this politely, but I seriously doubt your numbers. Bots that aren’t running headless browsers should, on their own, account for more than what you estimated.
I’d love to know how you can be so certain of your numbers. What tools do you use, or how do you measure your traffic?
In our case our audience is high school students, and schools WILL occasionally block just certain resource types. Their incompetence doesn’t make it any less your problem.
phantomjs/webdriver (incl. extra-stealth) is about 2.6% by my estimate. They load the JavaScript just fine.

A page that has, in the <body>, some code like this will load a.js, then load b.txt or c.txt based on what happened in a.js. Then, because I know basic math, I can compare the number of times a.js loads against the number of times b.txt and c.txt load, according to my logfiles.
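The snippet itself didn’t survive here, but a minimal sketch of the pattern described, with everything beyond the filenames assumed, might look like this as the body of a.js (included on the page with a plain script tag):

```js
// Hypothetical reconstruction of the pattern described above: if a.js
// executes, request b.txt; if it blows up, request c.txt. Both requests
// exist only so they show up in the server’s access logs for counting.
(function () {
  function beacon(path) {
    // An image request still hits the access log even though the
    // response isn’t an image.
    new Image().src = path + "?" + Date.now();
  }
  try {
    // ...whatever work a.js actually does would go here...
    beacon("/b.txt"); // JS downloaded, parsed, and executed fine
  } catch (e) {
    beacon("/c.txt"); // JS ran but something in it failed
  }
})();
```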
What tools do you use, or how do you measure your traffic?
Tools I build.
I buy media for these web pages, and I am motivated to understand every “impression” they want to charge me for.
In our case our audience is high school students, and schools WILL occasionally block just certain resource types
I think it’s important to understand the mechanism by which the sysadmin at the school decides to do anything. If you’ve hosted an old version of jQuery that has some XSS vector, you’ve got to expect someone is going to block jQuery regexes, even if you’ve since updated the version underneath. That’s life.
The way I look at it is this: if people can get to Bing or Google and not to me, that’s my problem, but if they can’t get to Bing or Google either, then they’re going to sort that out themselves. There’s a lot that’s under my control.
A spotty connection hasn’t loaded the dependencies correctly
Don’t depend on other people’s network. If my server is up, it can serve all of its assets. “Spotty” connections don’t work any other way.
Can I invite you to a ride on a German train from, say, Zurich to Hamburg? Some sites work fine the whole ride; some sites really can’t deal with the connection being, well, spotty.
If you can host me, I’m happy to come visit.

Some sites work fine the whole ride; some sites really can’t deal with the connection being, well, spotty.
Yeah, and I think those sites can do something about it. Third-party resources are probably the number one issue for a lot of sites. Maybe they should try the network simulator in the browser developer tools once in a while. My point is that the JavaScript-ness isn’t as big a problem as the people writing the JavaScript.

I would say maybe 3 of these are actually realistic.
I’ve recently had to use Edge to click the unsubscribe button for a newsletter, because it’s apparently my only browser not locked down enough to block the offending website’s garbage.
I know it’s a bit of an aside, but I’d add “user has an ad blocker installed”, and I think they would not have managed to mess up a simple button without JavaScript.
What a depressing comment section here.

I’ve seen many of these happen just at the company I’m currently at. Extensions especially are an awful and constant source of errors. So are mobile browsers themselves, injecting their own crap. Users don’t know the source of the breakage, even if they cared. I’d say about 90% if not more of the errors we encounter are JavaScript-related, and a similar percentage of those are not caused by code we wrote.
We still use JavaScript to improve the experience, and I don’t see this article arguing against that. We even have a few SPAs around, although those are mainly backoffice tools. However, we do make sure that the main functionality works even if HTML is the only thing you managed to load.
I’m building a product at the moment, a web application rather than a web site, though it involves publishing content, so there’s an element of web-site-ness. Much of the publishing part does involve fair bits of JS to provide useful interface elements that make things smoother (think typeahead comboboxes, chip inputs for easily adding tags, etc.).
Now, a lot of this can be achieved with ludicrous amounts of <form> and string parsing (tags can be done like “tag1, tag2, tag3” in a simple <input>, for example), and popover menus become <a> tags that, if you have no JS, simply take you to a new page that provides the options as <a> links.
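That comma-separated tag field needs only a few lines on the server. A sketch, with the field name and trimming rules as my own assumptions:

```js
// Sketch of handling the no-JS tag input described above: the form posts
// "tag1, tag2, tag3" as one field and the server splits it. The exact
// trimming and filtering rules are assumptions.
function parseTags(raw) {
  return raw
    .split(",")
    .map((tag) => tag.trim())
    .filter((tag) => tag.length > 0);
}

// parseTags("tag1, tag2 ,tag3") -> ["tag1", "tag2", "tag3"]
```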
But hot damn, it’s really hard and very time-consuming. At some point I decided the limited time I have to work on this product just isn’t worth trying to make everything progressively enhanced. It was a learning experience, sure, and probably achievable by a team of more than one, but the tradeoffs just don’t seem worth it when I look at the demographics and see mostly bog-standard mobile browsers. (The irony is not lost on me that I cannot gather demographics of users who disable JS, because gathering that data requires JS…)
I decided my time is better spent optimising what’s already there: reducing the amount of JS as much as possible (but not to absolute zero), improving load times, and just doing all the normal clever stuff we should be doing on the web to make a good experience for all.
I do think JS can act as a way to combat most of the issues mentioned in the post. Now, I have absolutely atrocious internet (a side effect of living in the City of London, surrounded by quant firms taking all the fiber, I am left with 1 Mb/s ADSL; welcome to the centre of the finance world, we have internet from 1998), but the one product that never lets me down is Linear. It’s probably the best example of a local-first web application: it behaves absolutely perfectly through constant net drops thanks to its CRDT/local-first architecture, which simply cannot be achieved without JS. And I’d rather have that than “Confirm form resubmission?” dialogues after a failed POST request.
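Very roughly, that offline resilience comes down to an outbox that applies edits locally first and syncs when the network returns. A sketch, with the endpoint and helper names invented (real CRDT merging is far more involved):

```js
// Rough sketch of the local-first idea: apply edits to local state
// immediately, queue them, and flush to the server when online.
// "/sync" and renderLocally() are invented for illustration.
const outbox = [];

function renderLocally(edit) {
  // Update in-memory state / the DOM here; stubbed for the sketch.
}

function applyEdit(edit) {
  renderLocally(edit); // the UI responds instantly, even offline
  outbox.push(edit);   // remember the edit for the server
  flush();
}

async function flush() {
  while (outbox.length > 0 && navigator.onLine) {
    try {
      await fetch("/sync", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(outbox[0]),
      });
      outbox.shift(); // confirmed; drop it from the queue
    } catch {
      break; // still unreachable; retry on the next flush
    }
  }
}

// Retry whenever connectivity comes back.
window.addEventListener("online", flush);
```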
(And as always, when I post a comment, I Ctrl+C the text into Notepad, just in case posting fails and I lose it.)
User is using a browser that doesn’t support JS, like most TUI browsers or ultralight options
User’s browser doesn’t support the version of JavaScript you happened to ship (I think of folks using now-unsupported Chromebooks that, without flashing a new OS, can never get another browser upgrade… similar for KaiOS phones, or trying to use a retro computer)
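One hedged way to handle that last case: feature-test at runtime and load a bundle the engine can actually parse. A sketch; the file names are invented, and the new-syntax code must live in its own file because parse errors can’t be caught within the same script:

```js
// Sketch: old engines fail at *parse* time on new syntax, so modern code
// has to ship in a separate file. Feature-test, then pick a bundle.
// "modern.js" and "legacy.js" are invented names.
function supportsAsyncFunctions() {
  try {
    new Function("async () => {}"); // throws SyntaxError on old engines
    return true;
  } catch (e) {
    return false;
  }
}

var script = document.createElement("script");
script.src = supportsAsyncFunctions() ? "/modern.js" : "/legacy.js";
document.head.appendChild(script);
```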
User is using a browser that doesn’t support JS, like most TUI browsers or ultralight options
While I totally love browsing the web in a text browser like links2, I absolutely do not expect complex websites to work properly, and I don’t think it makes sense to optimise for it.

But a lot of basic, informational sites don’t work either, like a blog or a landing page. The fact that these don’t work is problematic.