HTTrack Website Copier
Free software offline browser - FORUM
Subject: Is Depth-first downloading possible
Author: Jaggu
Date: 11/05/2005 07:31

I have a number of URLs (> 50) which I store in a file, pass in with the URL
list option, and download for offline access. Each of those URLs has fewer
than 10 other objects (HTML, GIF, JPG, JS) to download.

However, HTTrack's default mode of operation results in a lot of errors: by
the time the program comes back to fetch the remaining files for the original
URLs, the links have "expired" and can no longer be downloaded.

Of course, some of those links do download properly, but others don't. Is
there any way of forcing the program to download all the files for each URL
before going on to the next?
Apologies if this has been answered earlier, but my searches did not yield
any results.
