Hacker News

They kind of do store the internet though, don't they? They store a cached version of most pages.


Tbh the source code of the whole internet would probably be a few PB at most; text is really cheap to store, especially since it compresses so well. Images and videos are what make the premise impossible, because even with perfect compression you'd need an impossible amount of storage to hold every video published by mankind.
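A quick back-of-envelope sketch of that "few PB" claim. Every number here is an assumption picked for round arithmetic, not a measurement:

```python
# Back-of-envelope check of the "few PB for all HTML" claim.
# All inputs are rough assumptions, not measured values.
pages = 100e9           # assume ~100 billion indexable pages
avg_html_bytes = 100e3  # assume ~100 kB of markup per page
compression = 10        # assume ~10x compression for text

raw_pb = pages * avg_html_bytes / 1e15   # petabytes, uncompressed
compressed_pb = raw_pb / compression      # petabytes, compressed

print(raw_pb, compressed_pb)  # → 10.0 1.0
```

Under those assumptions you land at ~10 PB raw and ~1 PB compressed, which is at least in the same ballpark as "a few PB".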


Well, most of that is on youtube, so they kinda have a copy anyway :)


Yeah, but pages are HTML, and HTML compresses extremely well. With the latest algorithms you could probably get as low as one byte per page. Probably even better with a decent middle-out compression algorithm.

(Also yes, you are correct within the realm of reality, but not within the realm of comedy.)

https://en.m.wikipedia.org/wiki/Yes,_and


Just take the “middle” out of the web: if a page is more than 50 kB excluding images but including CSS and scripts, just ignore it.


Should be noted HTML compresses extremely well.
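Easy to see for yourself with the standard library: markup is full of repeated tags, so DEFLATE's back-references eat it alive. The page below is a toy, hypothetical example, but the effect holds for real HTML (just with smaller ratios):

```python
import zlib

# A deliberately repetitive toy page: tags and attributes repeat
# constantly, which is exactly what DEFLATE exploits.
html = "<html><body>" + "<div class='row'><p>hello world</p></div>" * 1000 + "</body></html>"

raw = html.encode()
compressed = zlib.compress(raw, level=9)

# The repeated markup collapses to a tiny fraction of its raw size.
print(f"{len(raw)} -> {len(compressed)} bytes, "
      f"{len(raw) / len(compressed):.0f}x smaller")
```

Real-world pages won't hit one byte, obviously, but 5-10x is routine, which is why servers gzip HTML on the wire.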


∞ * .1 = ∞

But the 25TB you showed above is a better prospect.



