
I think that would work if that page directly had references to the zip files, but it doesn't. The references are on the sub pages for each font.


Yeah, but wget -m sets -l (the depth limit) to unlimited, so I don't see why it shouldn't work out of the box. But for some reason it won't get those .zip files.
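The exact command isn't quoted in the thread, but from the flags mentioned it was presumably something like this sketch (the index URL is my guess from context and purely illustrative):

```shell
# Minimal sketch of the mirror command under discussion (URL illustrative):
#   -m       mirror mode: recursion with infinite depth (-l inf) plus timestamping
#   -A zip   accept filter: keep only files whose names end in "zip"
wget -m -A zip https://damieng.com/typography/zx-origins/
```

Note that -A applies to what is *kept*, not necessarily to what is fetched: HTML pages still have to be retrieved so wget can parse them for further links.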


Doesn't the -A zip cause it to skip the subpages? That's what I'm saying: the filter is keeping you from traversing main page -> sub page -> zip file.


No, if you check the output, or even enable -d, you'll see that it does go through the sub-pages. It downloads each index page to a temporary file so it can parse it for more sub-pages, then deletes it afterwards because it doesn't match the pattern.
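The download-then-delete cycle described above can be watched directly in the debug output; a hedged sketch (URL again illustrative, and the exact wording of wget's removal message may vary by version):

```shell
# Observe wget fetching HTML pages for link extraction and then removing
# them for failing the -A filter (URL illustrative).
wget -d -m -A zip https://damieng.com/typography/zx-origins/ 2>&1 \
  | grep -i 'remov'
```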

It even goes as far as to download download.damieng.com/fonts/conversions/Micropack.zip, proving that it does traverse across domains and sub-directories, yet it fetches nothing under download.damieng.com/fonts/zx-origins.

It's really strange. I suspect it has something to do with all the .zip archives being linked without a protocol (//download.damieng.com). It might be a bug, but I'd be shocked if no one had encountered it before.
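If the protocol-relative links really are the culprit, one way to sidestep wget's recursion for the archives is to mirror the HTML shallowly, extract the //download.damieng.com/....zip references yourself, normalize them to https://, and feed the list back to wget. A sketch under those assumptions (the index URL and depth are illustrative):

```shell
# Workaround sketch: harvest the .zip links manually instead of relying
# on wget's recursion (index URL illustrative).
wget -r -l 2 -np -P mirror https://damieng.com/typography/zx-origins/

# Pull out protocol-relative and absolute .zip links, prefix bare "//"
# with "https:", de-duplicate, and hand the list back to wget.
grep -rhoE '(https:)?//download\.damieng\.com/[^" ]+\.zip' mirror \
  | sed 's|^//|https://|' \
  | sort -u \
  | wget -nc -i -
```

The grep/sed pair is the part doing the real work: it turns the scheme-less `//host/path.zip` references, which the recursion seems to choke on, into ordinary absolute URLs that `wget -i` accepts.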

