HTTrack Website Copier
Free software offline browser - FORUM
Subject: Update of large dynamic sites.
Author: Rick
Date: 07/15/2003 21:44
 
Hello,

We have been using httrack to gather information from 
public websites.  

Some of the websites are very large (2 GB) and are dynamic 
(.cfm).  Once a site has been crawled, I only have a 
two-hour window in which to update it.
I am controlling the updates with 

"httrack -O $site --update -E5 -%c7200" 

but this only gets a portion of the site.  I want to be 
able to get the remaining data by restarting the following 
day with something like 

"httrack -O $site --continue -E5 -%c7200 -i" 

but it doesn’t appear to be working.  The size of the data 
isn’t increasing; it just appears to restart the 
update from the beginning each time.
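
For reference, the full sequence I am trying to run (with 
the project path shortened to $site) is roughly:

  # day 1: run the update, capped at the two-hour window
  httrack -O $site --update -E5 -%c7200

  # following days: resume the interrupted update from the cache
  httrack -O $site --continue -E5 -%c7200 -i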

Thank you for your time.

Rick
 