So I checked up on my HTTrack server about two days after starting it, only to see:
PANIC! Too many URLs: >99995 [2870]
service=blogger&ltmpl=start (18963 bytes) - OK
Done.
Thanks for using HTTrack!
So I most likely told it to copy the internet by mistake. Either way, here is
the command I used:
httrack http://www.somebigblogofasite.com -W -O "blog/big_blog_site"
--disable-security-limits --max-rate 3000000 --ext-depth 2 --socket 5
--display -F "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.4)
Gecko/2008102920 Firefox/3.0.4" --footer ""
So did I tell it to go too far? I figured downloading all the pages on the site,
plus one or two pages of the sites it links to, would be sufficient, but now it
has terminated the command.
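If the problem is HTTrack's internal link cap (it seems to abort once it has queued roughly 100,000 URLs, and --ext-depth 2 with --disable-security-limits would let every externally linked site pull in its own links), I'm guessing something like this would keep the crawl contained. This is just a sketch of what I'm considering, assuming the -#L option really does raise the cap on links tested:

```shell
# Stay on the one site (--ext-depth 0) and raise the internal link cap
# with -#L, which (if I understand the docs) sets the maximum number of
# links HTTrack will test before giving up with "Too many URLs".
httrack http://www.somebigblogofasite.com -W -O "blog/big_blog_site" \
    --ext-depth 0 -#L1000000 --max-rate 3000000
```

Is that the right way to read the error, or is something else going on?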
Thanks,
Brando753