HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: HTTrack vs webmasters
Author: Renardrouge
Date: 01/27/2003 23:03
 
> 1) A web browser does not use robots.txt

Yes, browsers are not robots.
:-)
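
For reference, a webmaster who wants to keep well-behaved 
mirroring tools out can already do it with two lines of 
robots.txt (the user-agent token below is only a sketch; 
which token a given tool matches depends on its settings):

User-agent: httrack
Disallow: /

Of course this only works for tools that honour robots.txt 
in the first place, which is the whole point of the debate.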


> and I need to capture sites in a way a user clicking 
> through a site would.

And does the webmaster agree? Why do you need this site 
on your local computer? The site will be updated again 
soon; it is online and kept up to date, all over the 
world. A local copy is out of date within a few days 
(or hours).

Is it for your business? If so, is there really no way to 
avoid this step? And will you actually use the _whole_ 
site once it is copied locally?

> I do agree with you, though, Renardrouge, that HTTrack 
> with its default settings is quite dangerous. In 
> particular, the connection / speed limits are not 
> defined, which basically lets the program go all-out on 
> a site.

> Perhaps instead of flaming, we need to constructively 
> work with Xavier to define some default settings, and 
> perhaps some in-program warnings when changing 'Expert 
> options'.

OK, let's go.
:-)
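
As a starting point, a polite baseline could look 
something like the command below. This is only a sketch: 
example.com is a placeholder, and the option letters and 
values should be double-checked against the httrack 
manual before anyone adopts them as defaults.

httrack "http://www.example.com/" -O ./mirror -c2 -A25000 -%c4 -s2

Here -c2 limits HTTrack to two simultaneous connections, 
-A25000 caps the transfer rate at roughly 25 KB/s, -%c4 
caps the number of new connections per second, and -s2 
tells it to always follow robots.txt. Defaults like 
these, plus an in-program warning when they are raised, 
would answer most of the complaints.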
 

All articles

Subject Author Date
HTTrack vs JOC Webspider

01/25/2003 15:30
Re: HTTrack vs JOC Webspider

01/25/2003 22:38
Re: HTTrack vs JOC Webspider

01/26/2003 08:59
Re: HTTrack vs JOC Webspider

01/26/2003 14:50
HTTrack vs webmasters

01/27/2003 14:03
Re: HTTrack vs webmasters

01/27/2003 19:10
Re: HTTrack vs webmasters

01/27/2003 21:45
Re: HTTrack vs webmasters

01/27/2003 22:46
Re: HTTrack vs webmasters

01/27/2003 23:03
Re: HTTrack vs webmasters

01/27/2003 23:22
Re: HTTrack vs webmasters

01/27/2003 23:31
Re: HTTrack vs webmasters

01/28/2003 07:23
Re: HTTrack vs webmasters

01/28/2003 22:12
Re: HTTrack vs webmasters

04/14/2005 19:22




0

Created with FORUM 2.0.11