Ok, now it worked.
First I had a directory with a dozen subfolders for a dozen
websites. HTTrack downloaded all of them (or at least several)
at once, but bandwidth usage was rather low. After a few
minutes it ate all the CPU and then died.
Now I spawned one httrack instance for each of those dozen
websites. Instead of "httrack --continue" I used "httrack
--continue [url]", and I've had no problems in the last 4
hours: no CPU-eating, no quitting, and fast and complete
website downloads.
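For anyone wanting to do the same, a minimal sketch of the one-instance-per-site approach. The URL list here is purely illustrative (substitute your own dozen sites), and it assumes each site already has a mirror to resume, as in my setup:

```shell
#!/bin/sh
# Hypothetical list of mirrored sites -- replace with your own URLs.
urls="http://example.com http://example.org http://example.net"

for url in $urls; do
    # One httrack process per site; --continue resumes that site's
    # existing mirror instead of starting over.
    httrack --continue "$url" &
done

wait    # block until every per-site download has finished
true    # don't propagate a failed background job's exit status
```

Run it from the directory that holds the per-site subfolders so each instance picks up its own mirror.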
Oh and btw, thx for this great software :)