> Where I live there is only dial-up, so everything takes
> a while. I have downloaded a couple of large web sites
> containing valuable information. I do not understand why,
> after taking two days to download a web site, doing an
> update takes just as long and appears to be redownloading
> every file.
Updates can take as long as the original download, because
each file must be checked on the remote server, and because
the HTTP request/response latency is sometimes comparable to
the time spent transferring the content itself.
> When a download is interrupted and then I run
> "continue interrupted download", the files are scanned and
> the download continues where it left off, without
> redownloading the files I already have. Shouldn't an
> update only take a short while, making the few additions
> and changes? This seems very bizarre to me.
HTTrack has no way to check the "freshness" of a file
through standard HTTP without contacting the server. The
only thing to do is to check all files, one by one, and
that can take some time.