HTTrack Website Copier
Free software offline browser - FORUM
Subject: memory eating bug
Author: Mihkel
Date: 01/30/2009 16:59
 
Setting the maximum number of links scanned to 30 000 000 (30 million) creates a
nice memory hole (capable of eating all free memory). It might also have
something to do with the fact that I am running two instances of WinHTTrack
together, but previously it worked fine (with maximum links set lower). When the
scan starts, it shows "0/-1" under "Links scanned:". Only one link is given to
process; it is correctly written and the page exists.

Sadly, some pages (dictionaries, for example) really do have that many links.
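A rough back-of-the-envelope sketch of why a very high link limit can eat memory, assuming (hypothetically) the crawler preallocates a fixed bookkeeping record per link. The 500-byte record size is an illustration, not HTTrack's real figure:

```python
RECORD_BYTES = 500          # assumed per-link bookkeeping (URL, save path, flags)
MAX_LINKS = 30_000_000      # the limit from the report above

total = RECORD_BYTES * MAX_LINKS
print(f"{total / 2**30:.1f} GiB")   # prints "14.0 GiB" -- before any page data
```

Under those assumptions the table alone would need about 14 GiB, doubled by running two instances at once, which would easily exhaust free memory on a 2009-era machine.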
 
All articles

Subject                  Date
memory eating bug        01/30/2009 16:59
Re: memory eating bug    01/30/2009 18:15
Re: memory eating bug    01/30/2009 22:08
Re: memory eating bug    02/01/2009 17:46
