> > What could I do to avoid that extreme slowdown? HTTrack seems to eat
> > all available memory and
>
> Could you give the 3.41-beta series a try? It
> should be much more scalable.
Hi Xavier,
Beta-10 has been running for 3 days now with no significant loss of machine
performance. Of course, anything that needs disk access is another story; my
guess is around a 60% loss, but since I have 4 live connections downloading at
an average of 110 Kb/s, it's quite acceptable.
I'm mirroring a site with close to 40 GB of data. HTTrack will save me a lot
of money in travelling, since this free, public data was only available
physically on site or over the internet. Whenever I had to consult the data, I
had to go back to the office, connect, compare it with my own data, take notes,
and get back on the road. Now it will all stay on my laptop. This is, AFAIK,
the only mirroring software that can download Java-generated pages of data, and
probably any database-driven site.
I only hope that all the data will stay on my system when I shut down HTTrack!
When I shut down version 3.40, it first took almost 1.5 hours to exit (it
scanned all the stored links), and when I looked at my hard drive, almost half
of the data had been erased. I hope this version won't do the same thing!
One thing that is bothersome: when you resume a stopped job, HTTrack rescans
every single link and downloaded file and compares it against the original. On
a site like the one I'm mirroring (~40 GB), you can imagine that this takes
almost a full day before the job really resumes. Maybe it would be possible to
store the state of the job and resume from that point! As it stands, resuming
does exactly the same work as updating a site.
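For illustration only, here is a minimal sketch of the kind of state-based
resumption I mean (hypothetical Python, not HTTrack's actual code; fetch() and
extract_links() are stand-ins for the real downloader and link parser):

import json
import os

STATE_FILE = "crawl_state.json"  # hypothetical checkpoint file

def save_state(pending, done):
    # Persist the crawl frontier and the set of finished URLs.
    with open(STATE_FILE, "w") as f:
        json.dump({"pending": list(pending), "done": list(done)}, f)

def load_state():
    # Resume from the last checkpoint instead of rescanning every file.
    if not os.path.exists(STATE_FILE):
        return [], set()
    with open(STATE_FILE) as f:
        state = json.load(f)
    return state["pending"], set(state["done"])

def crawl(start_urls, fetch, extract_links):
    pending, done = load_state()
    if not pending and not done:
        pending = list(start_urls)
    while pending:
        url = pending.pop()
        if url in done:
            continue
        page = fetch(url)  # download and store the page
        pending.extend(u for u in extract_links(page) if u not in done)
        done.add(url)
        save_state(pending, done)  # checkpoint after every page

With something like that, stopping and restarting would only cost re-reading
one small state file instead of rescanning 40 GB of mirrored files.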
Anyway, thanks for this great piece of software, guys!
Bill