> > than 5400 seconds passed.. giving up' line in the log, it
> > purged over 6000 files (previously downloaded?), all of
>
> Yes.. this is a limit of the update system, as the time
> limiter is only interrupting brutally the transfers
> without bothering about 'old' content.
So what is meant by "bothering about 'old' content"?
Perhaps understanding how httrack continues would help me
avoid losing files in the future. From what I see happening,
it begins parsing local files for links, gets a couple
thousand links into it, then starts fetching remote files
(parsing those while fetching?), and occasionally goes back
to the old local files, parsing them and checking that they
are actually linked to. If the time limiter kicks in before
httrack has verified that all the local files are actually
linked to, does it purge the files it hasn't yet verified?
Would it be possible to parse/verify all local files first,
and only then start fetching remote files?
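
Something like the ordering below is what I have in mind. This
is only a rough sketch of the idea, not HTTrack's actual logic;
verify_local_links() and fetch_remote() are made-up helpers
standing in for "parse a cached file and mark it as still
linked" and "download a remote URL".

/* Rough sketch of the ordering I have in mind -- NOT HTTrack code. */
#include <stdio.h>
#include <time.h>

typedef struct { const char *path; int verified; } local_file;

/* hypothetical helpers, for illustration only */
static void verify_local_links(local_file *f) { f->verified = 1; }
static void fetch_remote(const char *url)     { printf("GET %s\n", url); }

void update_mirror(local_file *local, int nlocal,
                   const char **remote, int nremote,
                   int time_limit_seconds)
{
    time_t start = time(NULL);

    /* Phase 1: parse/verify every already-downloaded file first,
     * so a timeout can never leave them unverified and purgeable. */
    for (int i = 0; i < nlocal; i++)
        verify_local_links(&local[i]);

    /* Phase 2: only now spend the remaining time on the network. */
    for (int i = 0; i < nremote; i++) {
        if (difftime(time(NULL), start) > time_limit_seconds) {
            puts("time limit reached -- stop fetching, keep all local files");
            return;
        }
        fetch_remote(remote[i]);
    }
}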
> By the way, the 'continue' option should have been faster;
> are there many errors or dynamic files on this site?

It's mostly painfully slow dynamic pages, when the site
actually works (which is why I want to mirror it). I get a
series of 404 and 500 (Internal Server Error) messages
during mirroring for files that actually exist.
BTW, I tried entering a floating-point connections/second
value (0.035) to lighten the load on the server; it seems to
round down to 0, which leads to unrestricted connections/sec.
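
My guess (only a guess, I haven't checked the source) is that
the value is read with an integer conversion, which would
explain the truncation:

/* Illustration only -- an assumption about why 0.035 becomes 0,
 * not a quote of HTTrack's actual option parsing.              */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *arg = "0.035";      /* connections/second as typed */
    int    as_int  = atoi(arg);     /* integer parse   -> 0        */
    double as_real = atof(arg);     /* what I'd hoped for -> 0.035 */

    printf("atoi(\"%s\") = %d\n", arg, as_int);
    printf("atof(\"%s\") = %g\n", arg, as_real);
    /* If 0 is then treated as "no limit", the connection rate
     * ends up unrestricted instead of throttled.               */
    return 0;
}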