Hello HTTrack developers,
first let me thank you for this great program!
I have run into a bug. I'm trying to (gently) mirror a site
that enforces an hourly usage limit, and when I hit that limit
the server starts returning 404 errors. I then have to restart
the mirror to get further with downloading the remaining pages.
Here I've got two problems. First, on every restart HTTrack
re-scans the entire site, which drastically reduces the time
available to download new pages before the site shuts me out
again. Ideally it would simply continue with the list of
already-scanned and outstanding pages from the point where the
previous run was aborted.
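For reference, I believe resuming an interrupted mirror from the
command line is done with the --continue option; the URL and output
folder below are just placeholders and the invocation is from
memory, so treat it as a rough sketch rather than my exact setup:

    # resume the previous run using the project's existing cache,
    # instead of starting the scan from scratch
    httrack "http://www.example.com/" -O "C:\My Web Sites\example" --continue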
That isn't the actual bug, however. For this site I generally use
the option "Do not purge old files" in the "Build" tab. For some
reason this option is not respected for 404 errors: HTTrack
overwrites a perfectly downloaded page with a 404 error page
instead of leaving the good copy alone. This makes it really hard
to download this particular site successfully within its limits,
because every failing session takes me two steps forward and one
step back by overwriting already downloaded files with 404 pages.
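In case it matters, I believe the command-line equivalent of my
setup would look roughly like the following; the mapping of -X0 to
"Do not purge old files" and the exact flag spellings are from
memory, so please correct me if they are off:

    # keep old files after an update (-X0) and throttle the
    # transfer rate (bytes/s) to stay under the hourly limit
    httrack "http://www.example.com/" -O "C:\My Web Sites\example" -X0 --max-rate=25000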
Is there an easy fix for this problem?
Many thanks
--
Andre