HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: damaged cache files
Author: richard
Date: 05/11/2005 09:06
Thanks for the response. 

HTTrack is not repairing the data; it seems to hang. I will check the
log files to see whether the HTTrack job ended correctly.
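For reference, whether a run ended cleanly can be checked from the log file HTTrack writes into the mirror directory (hts-log.txt). A minimal sketch; the directory name and the log line are stand-ins created here so the snippet is self-contained, not output from a real crawl:

```shell
# Stand-in mirror directory and log file (assumption; a real run writes hts-log.txt itself).
mkdir -p mirrors/site1
printf '12:05:31 Error: "Not Found" (404) at link http://intranet.example/missing\n' \
    > mirrors/site1/hts-log.txt

# A clean job reports no "Error:" lines; any hit means the run needs a closer look.
grep -c 'Error:' mirrors/site1/hts-log.txt
```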

Although I have to crawl more than 250 sites (in an intranet environment), I
will split the HTTrack job up per site (within a shell script). Currently I
have 6 HTTrack jobs gathering the website information; after the split I will
have 250 caches. You are right that maintaining and rebuilding is much easier
with one site per cache than with multiple sites in one.
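The per-site split could look something like this in a shell script. A sketch only: sites.txt and the example URLs are placeholders (nothing here comes from the actual intranet setup), and the httrack call is echoed as a dry run:

```shell
# Hypothetical list of intranet sites, one URL per line (placeholder data).
printf '%s\n' \
    'http://intranet.example/site1' \
    'http://intranet.example/site2' > sites.txt

# One HTTrack job per site, each with its own cache under mirrors/.
# "echo" makes this a dry run; remove it to actually start the crawls.
while IFS= read -r url; do
    dir="mirrors/$(printf '%s' "$url" | sed 's|^[a-z]*://||; s|[/:]|_|g')"
    echo httrack "$url" -O "$dir"
done < sites.txt
```

This way every site gets its own output and cache directory, so a damaged cache only costs one site's re-crawl.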

PS: The HTTrack product is great! I've worked with several commercial spiders
for gathering information, but I still end up using the HTTrack tool.


All articles in this thread (author names missing in the archive):

damaged cache files (05/09/2005 17:20)
Re: damaged cache files (05/10/2005 21:15)
Re: damaged cache files (05/11/2005 09:06)