
That article is talking about a very narrow aspect of latency: things like "if I press a letter on my keyboard, how quickly does it show up on my screen?" And most computer users care much more about throughput than latency (within reason, anyway).

But 20 years ago, starting applications was slow, because spinning-rust hard drives were slow, CPUs were slow, and we didn't have as much RAM. Yes, applications executed fewer instructions to start up and used less RAM back then. But today I type on a 13" laptop with 64GB of RAM, so much that my OS doesn't know what to do with most of it: the filesystem buffer cache is "only" using 12GB, and 44GB is just sitting completely unused. And yes, maybe it's absurd that running a desktop environment with a browser (granted, with over a thousand tabs open), some terminal emulator instances, and the one annoying Electron app I can't get rid of consumes 8GB of RAM, which alone is three orders of magnitude more RAM than I had 20 years ago.
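
(If you're curious where numbers like that come from on a Linux box, something like this rough sketch will print them. It assumes /proc/meminfo is available, and "Cached" is only an approximation of the filesystem buffer cache:)

    # rough sketch, Linux-only: read /proc/meminfo and print the big buckets
    def meminfo_gib():
        stats = {}
        with open("/proc/meminfo") as f:
            for line in f:
                key, value = line.split(":", 1)
                stats[key] = int(value.split()[0])  # values are reported in kB
        gib = 1024 * 1024  # kB per GiB
        print(f"total:  {stats['MemTotal'] / gib:.1f} GiB")
        print(f"free:   {stats['MemFree'] / gib:.1f} GiB")
        print(f"cached: {stats['Cached'] / gib:.1f} GiB")  # approx. page/buffer cache

    meminfo_gib()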

But the bottom line is that I, as a user, just don't need to care about resource utilization. I don't need to close my word processor when I want to play a game or watch a video. Oh, speaking of that -- in 2000, my CPU couldn't decode DVD-quality MPEG-2 video in real time! If I tried, CPU usage would be pegged at 100%, and the video would still stutter and be unwatchable. (I'm not saying that there weren't CPUs in 2000 that could do this, just that mine could not, and I couldn't afford one that could.) Today I can stream 4K video from some storage location tens or hundreds of miles away, decode it (on GPU or CPU), and play it back on a machine that sits on my lap (the aforementioned 2000-era CPU, of course, lived in a big tower case).

"Computers were faster 20 years ago" -- yeah, right.


