HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: problems at
Author: Haudy Kazemi
Date: 10/15/2002 22:38

I usually set the (internal site) mirror depth to 5 levels, 
and it was set to 5 in this case.  I don't like to leave 
this parameter blank because I don't want to get HTTrack 
permanently stuck on a recursive site.

I first tried adding a +** to the scan 
rules, but it made only a tiny difference (about 4 more 
links were downloaded).

Then I removed the 5 level restriction, leaving the new 
scan rule in place, and rescanned.  This time the site 
copied correctly.

Lastly I removed that new scan rule, leaving the other 
things the same, and rescanned.  This time the site was 
also copied correctly, meaning the new scan rule I added 
was not the real fix.  The real fix was removing the 5 
mirror level limit.
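
For reference, the failing and working runs can be sketched with 
HTTrack's command-line options (the site URL is a placeholder, since 
the actual site isn't named here; -rN sets the mirror depth and the 
trailing "+**" is the scan rule mentioned above):

```shell
# First run: depth limited to 5 levels (-r5) -- this missed pages,
# even with the extra "+**" scan rule added.
httrack "http://www.example.com/" -O ./mirror -r5 "+**"

# Working run: no -rN option, so no depth limit.  The "+**" rule
# turned out to be unnecessary once the limit was removed.
httrack "http://www.example.com/" -O ./mirror
```
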

The interesting thing about the 5 level limit is that when 
you manually browse the site, you certainly don't count 5 
levels to reach the pages HTTrack was missing.  I guess this 
means HTTrack was counting up faster than expected, thus 
leaving links out.  Is this a Javascript-induced problem, or 
is something else special about this site that causes 
HTTrack to count up too fast and thus miss pages?
Thanks for the clue :)

-Haudy Kazemi
