Disable all JavaScript? No, that does not work. That's why extensions like NoScript exist, so that you can whitelist safe domains.
This is actually quite usable, but on about 3% of websites it doesn't work. These sites are usually those horrible abominations of bloat-pages that, for no good reason whatsoever, need JavaScript files from 20 different domains in order to display a static page (I'm looking at you, Wired.com). In these cases, it becomes too tedious to pick out the domains that should be whitelisted.
Most times, I simply close the offending website. In the rare case that I actually want to visit the site, I temporarily switch to Chrome.
I still go through the hassle of using NoScript, because it un-breaks pages that otherwise, for no good reason whatsoever, decide it's OK to intercept common key combinations (Ctrl+T), disable right-clicks, serve pop-ups, pop-overs, or pop-unders, start playing video or sound without my asking (thank you very much), or generally try to hijack my browser, my data, or my computer.
> People still disable JS in 2016? I take it that 80%+ of the web is horribly broken for you guys.
Yeah, it is, but it's better to have to enable JavaScript on a case-by-case basis when desired than to travel across the Internet executing random code and impairing one's privacy.
Some websites require JavaScript to display images nowadays. What's wrong with <img>? Others require JavaScript to use the correct font. What's wrong with CSS? Still others require JavaScript to show text. What's wrong with HTML? Still others require JavaScript to build links. What's wrong with <a>?
JavaScript is destroying the Web. What was a powerful technology for disseminating formatted text across the world has become a cobbled-together GUI held together with baling wire and twine.
> Some websites require JavaScript to display images nowadays. What's wrong with <img>?
Lazy loading. By default, browsers load every image as soon as they parse the page. If you have a long article with potentially megabytes of images, this behaviour seriously slows down the initial load, and many readers who leave early will still have downloaded a bunch of images they never saw. That's why many sites only load images as they are about to scroll into view, but to do that, you can't use regular <img> tags.
You're right in that JavaScript shouldn't be a requirement though. A good implementation will provide a regular <img> inside a <noscript> tag.
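A minimal sketch of that pattern might look like this (file names are made up, and I'm assuming IntersectionObserver support; older browsers would need a scroll-handler fallback):

```html
<!-- The real URL lives in a data attribute, so the browser doesn't
     fetch the image up front. The <noscript> block keeps a plain
     <img> for readers without JavaScript. -->
<img class="lazy" data-src="photo.jpg" alt="A photo">
<noscript>
  <img src="photo.jpg" alt="A photo">
</noscript>
<script>
  // Swap the real URL in when the image approaches the viewport.
  var observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.getAttribute('data-src');
        observer.unobserve(entry.target);
      }
    });
  });
  var imgs = document.querySelectorAll('img.lazy');
  for (var i = 0; i < imgs.length; i++) {
    observer.observe(imgs[i]);
  }
</script>
```

Without JavaScript, the `<noscript>` branch renders and the reader still sees every image, just loaded eagerly.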
> Others require JavaScript to use the correct font. What's wrong with CSS?
Avoiding "flashes of invisible text"[1] when using web fonts. Unfortunately, different browsers have very different strategies for loading web fonts. Some of them will wait for the web font to load at any cost rather than showing a fallback font in the meantime. This means that you can be stuck for ages with everything in place except for the text, which is seriously irritating.
I've never implemented FOIT mitigation myself, but I assume that you could (and should) again provide a fallback in a <noscript> tag. Even without it, the text will at least still display without JavaScript, just in the second font in the font stack.
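One way to sketch that kind of mitigation, assuming a browser that exposes the CSS Font Loading API ("Fancy Serif" is a made-up font name):

```html
<style>
  /* Show the fallback font immediately; opt in to the web font
     only once we know it has loaded. */
  body { font-family: Georgia, serif; }
  .webfont-loaded body { font-family: "Fancy Serif", Georgia, serif; }
</style>
<script>
  // Browsers without document.fonts simply keep the fallback font,
  // as do readers with JavaScript disabled.
  if (document.fonts) {
    document.fonts.load('1em "Fancy Serif"').then(function () {
      document.documentElement.classList.add('webfont-loaded');
    });
  }
</script>
```

The key point is that text is readable from the first paint either way; the web font is a progressive enhancement rather than a blocker.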
> Some websites require JavaScript to display images nowadays. What's wrong with <img>?
Because the trifecta of retina (hi-DPI) displays, responsive design (using the same code for both mobile and desktop), and miserly bandwidth caps (especially on mobile networks) means that traditional <img> tags won't do it any more.
<picture> is designed to help with this, and I'm pleasantly surprised to find that it has landed in a surprising number of browsers (http://caniuse.com/#feat=picture), but you'll still need a polyfill, which is of course JavaScript.
Pages with a lot of images also use lazy loading to reduce bandwidth usage even further.
Of course a <noscript> should be included, but it's getting harder and harder to make the claim that it is economical to support people without JS.
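For what it's worth, the declarative version needs no script at all once browsers support it (hypothetical file names; the browser picks one source based on viewport width and device pixel ratio and downloads only that file):

```html
<picture>
  <source media="(min-width: 800px)"
          srcset="hero-large.jpg 1x, hero-large@2x.jpg 2x">
  <!-- The inner <img> doubles as the fallback for browsers that
       don't understand <picture> at all. -->
  <img src="hero-small.jpg"
       srcset="hero-small@2x.jpg 2x"
       alt="Hero image">
</picture>
```

It's only the polyfill for older browsers, and any lazy loading layered on top, that drags JavaScript back in.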
Quite the opposite actually, it's a much better experience. I avoid many of the annoyances one can encounter on a daily basis.
Most of us are likely not outright disabling, we're white-listing, it's a big difference and IMHO the best way to browse.
It's an approach that basically considers the user experience on all websites to be hostile until you decide to grant them some trust. Given that many websites do implement hostile user experiences, it works out perfectly.
A blog is a collection of documents and there's no reason to require JavaScript just to view them, the web was specifically designed for displaying documents.
It is just off by default. That doesn't mean we can't enable it.
Most of the time you enable only the domain of the URL itself. If you really need to and you trust the page, you enable everything, including the trackers, which are blocked by other means anyway.
Ignoring the people who use some sort of blocking, everyone effectively has JavaScript disabled until it loads. I don't have JavaScript disabled by default but I notice this regularly when pages either take a very long time to render or never do because something (network, origin server) has an error: