
This will never, ever happen. English Wikipedia (without pictures), compressed, is a measly ~20 GB. It's hard to quantify "all books ever written", but I have kept copies of online libraries large enough that they almost certainly contain every book you can remember, plus 10,000 you've never heard of for every one you can. It's not that much data; you can fit it on one or two regular HDDs.
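
For scale, here's a rough back-of-envelope in Python. The per-book size and title count are my own assumptions for illustration, not figures from any catalog:

    # Back-of-envelope: does "every book" fit on a couple of HDDs?
    # Assumed (not measured): ~1 MB per compressed plain-text book,
    # ~10 million distinct titles.
    bytes_per_book = 1 * 1024**2           # ~1 MB compressed text
    num_books = 10_000_000                 # assumed title count
    total_tib = bytes_per_book * num_books / 1024**4
    print(f"~{total_tib:.1f} TiB")         # ~9.5 TiB -> one or two HDDs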

Now, I did it because I'm that type of guy. There aren't that many people who actually do this, even though it's perfectly doable.

So why don't they? Because it doesn't make much sense unless you're afraid of an imminent nuclear winter. Wikipedia is updated and improved every day; you only occasionally want to refer to something old, but you almost always want to look up something new. Petabytes of video are uploaded to YouTube every year, and TB/day probably wouldn't be an overestimate for audio on Spotify. All of this data is being updated constantly.

Also, the above assumes pretty aggressive data compression. Is aggressively compressed data what we want? No. A 2-hour video compressed to about 500 MB was totally fine 15 years ago. If I download a 2-hour movie today, it's typically around 20 GB, and it's by no means uncompressed.
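
Just running the numbers on those two sizes (straight arithmetic from the figures above, nothing else assumed):

    # Average bitrate implied by each file size for a 2-hour movie.
    seconds = 2 * 3600
    for label, size_bytes in [("500 MB", 500 * 10**6), ("20 GB", 20 * 10**9)]:
        mbps = size_bytes * 8 / seconds / 10**6
        print(f"{label}: ~{mbps:.1f} Mbit/s")
    # 500 MB -> ~0.6 Mbit/s; 20 GB -> ~22 Mbit/s, still far from raw video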

Seriously, by now you should know for a fact that anyone who believes there's such a thing as "too much storage space" is stuck in the 80s.

And even if there were such a thing: realistically, a cluster of nodes in Google's datacenters can find the book or video you're looking for far faster than the most perfect HDD you could theoretically have locally. So, again, normal people wouldn't want all this stuff even if they could have it.

