Hacker News

To understand, you would have had to be there. Computers got faster (and then some). In the old days you would write something (or not even bother), then see it took far longer than desirable. You would rewrite and iterate over possible ways to rewrite; sometimes you would see the light, other times you would brute-force different approaches. The point where optimization was necessary was completely obvious, and 99% of modern code never needs that consideration.

If communism took over the world and were given 30 years, half these texts wouldn't make sense, because they talk about something to do with capitalism, which would no longer exist.



I began my programming career on machines with performance, memory, and storage constraints no one today can imagine. Some of the necessary hacks and shortcuts from back then look like premature optimization and stupid coding today.

The Y2K “problem” gives the canonical example. In a world of vast and very cheap and fast storage, it makes no sense to save two characters in a date. But back in the ‘70s and early ‘80s when I implemented dates like that cutting those two characters over a few million records saved significant money. Disk space used to cost a lot, RAM (or core memory) used to cost a lot more.
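The two-digit-year layout described above can be sketched roughly as follows. This is a hypothetical illustration, not the poster's actual record format, and the record count and savings figures are made up for the arithmetic:

```python
# Hypothetical sketch of a '70s-style packed date field, illustrating
# why dropping the century saved real money across millions of records.

def pack_date(year: int, month: int, day: int, two_digit: bool = True) -> str:
    """Pack a date as 'YYMMDD' (the old layout) or 'YYYYMMDD'."""
    y = f"{year % 100:02d}" if two_digit else f"{year:04d}"
    return f"{y}{month:02d}{day:02d}"

old = pack_date(1982, 7, 4)                     # '820704'  (6 bytes)
new = pack_date(1982, 7, 4, two_digit=False)    # '19820704' (8 bytes)

# Illustrative numbers: a few million records, 2 bytes saved per date field.
records = 5_000_000
saved_bytes = records * (len(new) - len(old))   # 10,000,000 bytes

# The Y2K catch: '05' is ambiguous between 1905 and 2005, so the
# saved bytes eventually had to be paid back with interpretation rules.
print(old, new, saved_bytes)
```

Ten megabytes looks trivial now, but at 1970s disk prices it was a line item worth cutting.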


Computer hardware is very fast now, so why does my computer lag noticeably on OS and browser operations? A facetious question, and perhaps not remotely a dealbreaker, but I expect better and would expect the same of my own software. I agree with GGP that too many people seem to read "premature optimization is the root of all evil" as "don't optimize until it's too late, then painstakingly chase diminishing returns". There is a comfortable middle ground in the optimizing-vs-delivering tradeoff that I think is widely missed.


Multiple layers of complex abstractions and sloppy coding account for a lot of it. We write code as if it had no hardware constraints, so we get megabytes-size web pages running a slow interpreted language in a browser built on multiple layers of frameworks and libraries. Over the last several decades inefficient and bulky programming has consumed all of the gains in hardware, to the point that modern applications feel much slower than similar apps from the '80s.

Knuth did not have that in mind when he wrote about premature optimization.


My prediction was that we would create a CPU with a slow, limited opcode set for humans to write against, line by line. Something easy to work with, with a good static module system and certified snippets of code. Anything written for it would still run circles around higher-level languages despite the slow clock speed. It didn't happen, but it still could.



