
Yeah, that's me. Capture a snapshot of it from time to time, so if it ever goes offline (or off the rails: requires a subscription, begins to serve up ads), you have the last "good" one locally.
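For what it's worth, a minimal Python sketch of that periodic-snapshot idea. The URL and output directory are placeholders, not anything from this thread:

    import datetime
    import pathlib
    import urllib.request

    # Placeholder target: point this at whatever site you want to keep a copy of.
    URL = "https://example.com/"
    OUT_DIR = pathlib.Path("snapshots")

    def snapshot(url: str = URL) -> pathlib.Path:
        """Fetch the page and save it under a date-stamped filename."""
        OUT_DIR.mkdir(exist_ok=True)
        stamp = datetime.date.today().isoformat()
        dest = OUT_DIR / f"snapshot_{stamp}.html"
        with urllib.request.urlopen(url) as resp:
            dest.write_bytes(resp.read())
        return dest

    if __name__ == "__main__":
        print(f"saved {snapshot()}")

Run it from cron (or any scheduler) and you accumulate dated copies; if the live site ever goes bad, the newest pre-rot copy is your "good" one.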

I have a snapshot of Wikipedia as well (well, not the whole of Wikipedia, but 90GB worth).



Which Wikipedia snapshot do you grab? I keep meaning to do this, but whenever I skim the Wikipedia downloads pages, they offer hundreds of different flavors without any immediate documentation as to what differentiates the products.


You can use Kiwix: https://kiwix.org/en/
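And a hedged sketch of pulling one of those snapshots down in Python. The exact filename here is an assumption: the real files on download.kiwix.org carry a YYYY-MM date suffix, so check the directory listing at https://download.kiwix.org/zim/wikipedia/ for the current name first.

    import shutil
    import urllib.request

    # Assumed filename; the actual archive on download.kiwix.org includes a
    # date suffix, so browse the directory listing and substitute it here.
    ZIM_URL = ("https://download.kiwix.org/zim/wikipedia/"
               "wikipedia_en_all_maxi_2024-01.zim")

    def download_zim(url: str, dest: str) -> None:
        """Stream the (tens-of-GB) ZIM archive to disk without buffering it all in memory."""
        with urllib.request.urlopen(url) as resp, open(dest, "wb") as f:
            shutil.copyfileobj(resp, f)

    if __name__ == "__main__":
        download_zim(ZIM_URL, "wikipedia_en_all_maxi.zim")

Once it's on disk, the Kiwix desktop app (or kiwix-serve) reads the ZIM file directly, fully offline.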


wikipedia_en_all_maxi

I guess that means English ... and maxi? As I say, it was something around 90GB or so.


Was hoping you had more insight than "maxi sounds good," which is also the best I have.



