Computer hardware is very fast now, so why does my computer lag noticeably on OS and browser operations? A facetious question, and perhaps it's not remotely a dealbreaker, but I expect better, and I'd hold my own software to the same standard. I agree with GGP that too many people seem to take "premature optimization is the root of all evil" as "don't optimize until it's too late and then painstakingly get diminishing returns". There is a comfortable middle ground in the optimizing-versus-delivering tradeoff that I think is widely missed.
Multiple layers of complex abstractions and sloppy coding account for a lot of it. We write code as if it had no hardware constraints, so we get multi-megabyte web pages running a slow interpreted language in a browser built on multiple layers of frameworks and libraries. Over the last several decades, inefficient and bulky programming has consumed all of the gains in hardware, to the point that modern applications feel much slower than similar apps from the '80s.
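To make that concrete, here's a toy C sketch (the names and setup are mine, purely illustrative): the same sum computed once with a tight loop over a flat array, and once through heap-boxed values behind a function-pointer interface, as a crude stand-in for a deep abstraction stack. The exact timings depend on your compiler and hardware; the point is how much extra work gets done per useful operation.

```c
/* Toy illustration: direct computation vs. the same computation through
 * one layer of boxing and indirection. Purely a sketch, not a benchmark. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 1000000

/* Direct: a tight loop over a flat array. */
static long sum_direct(const int *a, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++) s += a[i];
    return s;
}

/* "Layered": every element is boxed on the heap and reached through a
 * function-pointer interface, mimicking a deep framework stack. */
struct boxed { int (*get)(const struct boxed *); int value; };
static int boxed_get(const struct boxed *b) { return b->value; }

static long sum_layered(struct boxed **boxes, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++) s += boxes[i]->get(boxes[i]);
    return s;
}

int main(void) {
    int *a = malloc(N * sizeof *a);
    struct boxed **boxes = malloc(N * sizeof *boxes);
    for (size_t i = 0; i < N; i++) {
        a[i] = (int)i;
        boxes[i] = malloc(sizeof **boxes);
        boxes[i]->get = boxed_get;
        boxes[i]->value = (int)i;
    }

    clock_t t0 = clock();
    long s1 = sum_direct(a, N);
    clock_t t1 = clock();
    long s2 = sum_layered(boxes, N);
    clock_t t2 = clock();

    printf("direct:  %ld in %.3f ms\n", s1, 1000.0 * (t1 - t0) / CLOCKS_PER_SEC);
    printf("layered: %ld in %.3f ms\n", s2, 1000.0 * (t2 - t1) / CLOCKS_PER_SEC);

    for (size_t i = 0; i < N; i++) free(boxes[i]);
    free(boxes);
    free(a);
    return 0;
}
```

Even one layer of per-element allocation and indirect calls usually costs a noticeable multiple in cache misses and lost inlining; a real application stack has many such layers.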
Knuth did not have that in mind when he wrote about premature optimization.
My prediction was that we would create a CPU with a slow and limited opcode set that humans could write for directly, line by line. Something easy to work with, with a good static module system and certified snippets of code. Anything written for it would still run circles around higher-level languages despite the slow clock speed. It didn't happen, but it still could.
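For what it's worth, here's a minimal C sketch of the kind of machine I'm imagining: a stack machine with a handful of opcodes, small enough to hold the whole model in your head and write programs by hand. The opcode names and encoding are invented for illustration; this isn't any real ISA.

```c
/* A hypothetical, deliberately tiny instruction set and the loop that runs it.
 * No bounds checking or error handling; this is a sketch of the idea only. */
#include <stdio.h>
#include <stdint.h>

enum op { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

struct insn { enum op op; int64_t arg; };

static void run(const struct insn *prog) {
    int64_t stack[256];   /* fixed-size operand stack, unchecked for brevity */
    int sp = 0;
    for (size_t pc = 0; ; pc++) {
        struct insn in = prog[pc];
        switch (in.op) {
        case OP_PUSH:  stack[sp++] = in.arg;                      break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp];          break;
        case OP_MUL:   sp--; stack[sp - 1] *= stack[sp];          break;
        case OP_PRINT: printf("%lld\n", (long long)stack[sp - 1]); break;
        case OP_HALT:  return;
        }
    }
}

int main(void) {
    /* (2 + 3) * 4, written out instruction by instruction. */
    struct insn prog[] = {
        { OP_PUSH, 2 }, { OP_PUSH, 3 }, { OP_ADD, 0 },
        { OP_PUSH, 4 }, { OP_MUL, 0 }, { OP_PRINT, 0 }, { OP_HALT, 0 },
    };
    run(prog);
    return 0;
}
```

A real design would obviously need memory, branching, and I/O, but even this much shows how small the mental model could stay while leaving the hardware free to execute it without layers of machinery in between.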