Hi,
I'm trying to copy a site with around 3,000,000 links. The
version in use is 3.31.
After around 1,000,000 or 1,100,000 links all the memory is
eaten up, in my case 256MB, and the parsing/scanning of files
continues very slowly, paging to disk.
Does HTTrack in fact have a limit on the number of links that
can be scanned, or a memory<->links relation? I've made several
attempts, but after 2 or 3 days of machine work the program
dies quietly, without errors and without any apparent activity
(that is, nothing is reading the HD).
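Is raising the link limit the right approach? If I remember the
option list correctly, HTTrack has a -#L option for the maximum
number of links, so something along these lines (the URL and
output path are just placeholders, and I may have the exact
syntax wrong) should in theory cover the whole site:

    httrack http://www.example.com -O /data/mirror -#L3000000

But even if that is accepted, it doesn't explain the memory
growth, so the memory<->links question still stands.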
Any advice/help/ideas?

regards
luis