> Hi, I'm trying to download a large number of HTML
> pages (800K, to be exact). I've tried adding the
> list of 800K URLs to HTTrack, and it begins to work,
> but it seems to go through the URLs and download
> them one by one, even though I have set the maximum
> number of connections to 25 (my connection speed
> isn't an issue). I've even tried setting the
> simultaneous connections to a lower number, e.g. 8,
> to test it out, but the program always just goes
> through the list one by one. At this rate I've
> calculated that it would take 92 days to download
> the pages, which is not really reasonable for me, so
> I'm wondering if anyone knows a way around this?
> Thank you for your help.
I'm having the same problem. Any ideas yet, besides running the program 30 times?