We desperately need this and similar ecologically-sound programming philosophies in high places right now.
Just considering news--Moby Dick is 1.2 MB uncompressed in plain text. That's far smaller than the "average" news website--I just loaded the New York Times front page. It was 6.6 MB. That's more than five copies of Moby Dick, solely for a gateway to the actual content I want. A secondary reload was only 5 MB.
I then opened a random article. The article itself was about 1,400 words long, but the page was 5.9 MB. That's about 4 KB per word without counting the gateway (which you need if you're not coming in from social media). Counting the gateway, it's closer to 9 KB per word--which is roughly the size of the article's entire plain text.
So all told, to read just one article from the New York Times, I had to download the equivalent of ten copies of Moby Dick--about 4,600 pages, approaching the entirety of George R.R. Martin's A Song of Ice and Fire, without appendices.
And if I check the NY Times just four times a day and read three articles each time, I'm downloading roughly 100 MB of stuff (83 Moby-Dicks) to read 72 KB worth of plain text.
Even ignoring first-principles ecological conservatism, that's just insanely inefficient and wasteful, regardless of how inexpensive bandwidth and computing power are in the West.
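If anyone wants to check the arithmetic, here's a quick back-of-envelope script (plain Python; the numbers are just my rough measurements from above, nothing official):

    # Back-of-envelope check using the rough sizes quoted above.
    MOBY_DICK_MB  = 1.2    # plain-text Moby Dick, uncompressed
    FRONT_PAGE_MB = 6.6    # first load of the NYT front page
    ARTICLE_MB    = 5.9    # one ~1,400-word article page
    ARTICLE_WORDS = 1400

    per_word_kb = ARTICLE_MB * 1000 / ARTICLE_WORDS                        # ~4 KB/word
    with_gateway_kb = (FRONT_PAGE_MB + ARTICLE_MB) * 1000 / ARTICLE_WORDS  # ~9 KB/word

    # Four visits a day, front page plus three articles each time:
    daily_mb = 4 * (FRONT_PAGE_MB + 3 * ARTICLE_MB)   # ~97 MB, call it 100
    moby_dicks = daily_mb / MOBY_DICK_MB              # ~80 copies

    print(f"{per_word_kb:.1f} KB/word bare, {with_gateway_kb:.1f} KB/word with the gateway")
    print(f"{daily_mb:.0f} MB per day, about {moby_dicks:.0f} Moby-Dicks")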
Faster, sure, once you have Facebook loaded.
Unless, of course, you have a news app that loads RSS feeds, which by nature is far faster than either.
The NYT has photographs. Moby Dick with photographs would use more bytes. Moby Dick doesn't need photographs because the reader does all the work of creating the images in their head. The NYT does what users want, trading cheap bytes to save human processing time.
Photographs can be scaled and compressed. Most of the megabytes a visit to a website eats come from the stylesheets and JS the site itself loads, then from whatever those in turn pull in from third parties for tracking, analytics, ads, &c., and from the constant chatter all those scripts keep up with numerous servers. For most big sites, watching the network log in the developer tools is an enlightening experience.
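If you'd rather total it up in one go than eyeball the network panel, a rough sketch like this does the job (untested here; assumes Playwright is installed via pip install playwright and playwright install chromium, and the URL is just an example):

    # Rough sketch: total the on-the-wire bytes a page pulls down,
    # roughly what the DevTools network panel reports as "transferred".
    from playwright.sync_api import sync_playwright

    URL = "https://www.nytimes.com/"   # example; any heavy front page will do

    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()

        finished = []   # requests whose responses have fully arrived
        page.on("requestfinished", lambda request: finished.append(request))

        page.goto(URL, wait_until="networkidle")

        # sizes() reports per-request byte counts as sent over the wire
        total = sum(
            r.sizes()["responseBodySize"] + r.sizes()["responseHeadersSize"]
            for r in finished
        )
        browser.close()

    print(f"{len(finished)} requests, {total / 1_000_000:.1f} MB transferred")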