Those sound to me like the very definition of optimizations that are not premature, though. His goal was to teach foundational elements of running programs on computers. Loops and sorts are used all over the place, and improvements scale with inputs. I see a pretty big difference between solving a problem you don't yet understand (the way I've always mentally framed "premature optimization") and establishing reasons for implementing primitives in optimal ways.
In my experience, the people who say "premature optimization is the root of all evil" will say that about any and all optimization, right up until the point where all your customers think your product is a slow piece of garbage.
When I want some software to go faster I do profile it, but I also give at least some thought to avoiding unnecessary copying of data, or trig functions in tight loops. I have regularly run into people who say considering performance when selecting a data structure is premature optimization. It usually turns out they don't really want to test or profile; they just want to churn through tickets, and anything that gets in the way of that bores them.
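As a minimal sketch of the kind of thing I mean (the function and names here are illustrative, not anyone's real code), hoisting a loop-invariant trig call out of a tight loop:

    #include <math.h>

    /* Rotate n points by a fixed angle. The sin/cos of the angle are
       loop-invariant, so compute them once instead of n times. */
    void rotate_points(float *x, float *y, int n, float angle) {
        float c = cosf(angle);
        float s = sinf(angle);
        for (int i = 0; i < n; i++) {
            float xi = x[i], yi = y[i];
            x[i] = xi * c - yi * s;
            y[i] = xi * s + yi * c;
        }
    }

Nothing about this requires profiling first; it costs nothing to write it this way from the start.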
Measuring would attract even more protest, because not only am I using my experience to choose (likely) more efficient algorithms, I am also spending yet more precious company time to measure their performance.
To understand, you would have had to be there. Computers got faster (and then some). In the old days you would write something (or not even bother), then see it took way longer than desirable. You would rewrite and iterate over possible ways to rewrite. Sometimes you would see the light; other times you would try different approaches by brute force. The point where optimization was necessary was completely obvious, and 99% of modern code never needs that consideration.
If communism took over the world and were given 30 years, half the texts wouldn't make sense, as they talk about something to do with capitalism that doesn't exist anymore.
I began my programming career on machines with performance, memory, and storage constraints no one today can imagine. Some of the necessary hacks and shortcuts from back then look like premature optimization and stupid coding today.
The Y2K "problem" gives the canonical example. In a world of vast, very cheap, and fast storage, it makes no sense to save two characters in a date. But back in the '70s and early '80s, when I implemented dates like that, cutting those two characters over a few million records saved significant money. Disk space used to cost a lot; RAM (or core memory) used to cost a lot more.
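A back-of-envelope illustration (the record count and price per megabyte are assumptions for the sketch, not figures from the comment above):

    #include <stdio.h>

    /* Two bytes saved per date field over a few million records,
       priced at a rough (assumed) late-'70s disk cost per megabyte. */
    int main(void) {
        long records = 5000000;                /* assumed: a few million */
        double mb_saved = 2.0 * records / 1e6; /* ~10 MB */
        double dollars_per_mb = 200.0;         /* assumed ballpark price */
        printf("%.0f MB saved, roughly $%.0f\n",
               mb_saved, mb_saved * dollars_per_mb);
        return 0;
    }

With numbers in that ballpark, two characters per record added up to real money, not a micro-optimization.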
Computer hardware is very fast now, so why does my computer lag noticeably on OS and browser operations? A facetious question, and perhaps it's not remotely a dealbreaker, but I expect better and would expect the same of my own software. I agree with GGP that too many people seem to take "premature optimization is the root of all evil" as "don't optimize until it's too late, and then painstakingly chase diminishing returns". There is a comfortable middle ground in the optimization-versus-delivery tradeoff that I think is widely missed.
Multiple layers of complex abstractions and sloppy coding account for a lot of it. We write code as if it had no hardware constraints, so we get megabyte-sized web pages running a slow interpreted language in a browser built on multiple layers of frameworks and libraries. Over the last several decades, inefficient and bulky programming has consumed all of the gains in hardware, to the point that modern applications feel much slower than similar apps from the '80s.
Knuth did not have that in mind when he wrote about premature optimization.
My prediction was that we would create a CPU with a slow and limited opcode set for humans to write directly, with line numbers. Something that is easy to work with, with a good static module system and certified snippets of code. Anything written for it would still run circles around higher-level languages despite the slow clock speed. It didn't happen, but it still could.
I mean... Maybe? I guarantee that if you do any of the things he discusses in code that is getting reviewed at any company I have ever seen, you will get shot down for premature optimization.
Just look into Gosper's hack someday and tell me it would pass most reviews. (Which, I think, is probably unfair? But I can't imagine many places being fine with using standard int variables to represent subsets.)
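For anyone who hasn't seen it, here's a sketch of Gosper's hack: enumerating all k-element subsets of an n-element set as bitmasks in a plain unsigned int (example code for illustration, not from any particular review):

    #include <stdio.h>

    /* Gosper's hack: given x with exactly k bits set, return the next
       larger integer with exactly k bits set. */
    unsigned next_subset(unsigned x) {
        unsigned c = x & -x;             /* lowest set bit */
        unsigned r = x + c;              /* ripple carry over the low block of 1s */
        return (((r ^ x) >> 2) / c) | r; /* re-append the displaced 1s at the bottom */
    }

    int main(void) {
        /* enumerate all 3-element subsets of a 5-element set */
        unsigned x = (1u << 3) - 1;      /* smallest mask with 3 bits set: 00111 */
        while (x < (1u << 5)) {
            printf("%u\n", x);           /* each mask encodes one subset */
            x = next_subset(x);
        }
        return 0;
    }

It's compact and branch-free, but it reads like line noise in a review, which is exactly the point being made above.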