Hi,
I have a number of URLs (more than 50) that I store in a file and pass to
HTTrack via the URL list option, to download for offline access. Each of
those URLs references fewer than 10 other objects (HTML, GIF, JPG, JS) to
download.
However, in HTTrack's default mode of operation I get a lot of errors,
because by the time the program comes back to download the remaining files
for a given URL, its links have "expired" and can no longer be downloaded.
Of course, some of those links do download properly, but others don't. Is
there any way to force the program to download all the files for each URL
before going on to the next?
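
In case it helps to clarify what I am after, here is a rough sketch of the
workaround I have been considering: a small Python script that runs HTTrack
once per URL, so each page's files are fetched while its links are still
valid. The file name urls.txt, the output directory, and the flags are only
my assumptions and would need tuning; a built-in option would obviously be
preferable.

    #!/usr/bin/env python3
    # Sketch of the per-URL workaround described above: invoke HTTrack
    # once per URL so each page's objects are fetched before moving on
    # to the next URL.
    # Assumes "urls.txt" holds one URL per line and that httrack is on
    # the PATH; the flags (-O output dir, -r2 depth, -n "near" non-HTML
    # files) may need tuning for a given site.
    import subprocess

    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        subprocess.run(["httrack", url, "-O", "mirror", "-r2", "-n"])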
Apologies if this has been answered earlier, but my searches did not yield any
answers.
Regards,
Jaggu