HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: HTTrack vs webmasters
Author: Leto
Date: 01/27/2003 22:46
 
When I do my mirroring, I do not have robots.txt enabled in HTTrack, because:

1) A web browser does not use robots.txt, and I need to capture sites the way a user clicking through a site would.

2) I "browse" the target site first and analyse how to capture it carefully.

I do agree with you, though, Renardrogue, that HTTrack with its default settings is quite dangerous.  In particular, no connection or speed limits are defined, which basically lets the program go all-out on a site.

Perhaps instead of flaming, we need to constructively work 
with Xavier to define some default settings, and perhaps 
some in-program warnings when changing "Expert options".
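
For anyone wanting to set such limits explicitly rather than rely on the defaults, a throttled run can be sketched with HTTrack's command-line flags. The URL, output path, and the specific values below are illustrative assumptions, not recommended defaults:

```shell
# A throttled HTTrack run (values are illustrative):
#   -c4     allow at most 4 simultaneous connections
#   -%c10   open at most 10 new connections per second
#   -A25000 cap the transfer rate at roughly 25,000 bytes/second
#   -s0     never follow robots.txt rules (as described in this post)
httrack "https://example.com/" -O ./mirror -c4 -%c10 -A25000 -s0
```

Note that -s0 disables robots.txt handling entirely, so the connection and rate caps become the only thing protecting the target server.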


 


All articles

Subject Author Date
HTTrack vs JOC Webspider    01/25/2003 15:30
Re: HTTrack vs JOC Webspider    01/25/2003 22:38
Re: HTTrack vs JOC Webspider    01/26/2003 08:59
Re: HTTrack vs JOC Webspider    01/26/2003 14:50
HTTrack vs webmasters    01/27/2003 14:03
Re: HTTrack vs webmasters    01/27/2003 19:10
Re: HTTrack vs webmasters    01/27/2003 21:45
Re: HTTrack vs webmasters    01/27/2003 22:46
Re: HTTrack vs webmasters    01/27/2003 23:03
Re: HTTrack vs webmasters    01/27/2003 23:22
Re: HTTrack vs webmasters    01/27/2003 23:31
Re: HTTrack vs webmasters    01/28/2003 07:23
Re: HTTrack vs webmasters    01/28/2003 22:12
Re: HTTrack vs webmasters    04/14/2005 19:22





Created with FORUM 2.0.11