HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Too many robots.txt
Author: gatekat
Date: 07/23/2008 11:41
 
I'm having the same issue. It's trying to download the entire web. I have
'maximum external depth' set to 0, 'get non-HTML files related to a link'
unselected (even though I'd like it on), 'No external pages' selected, and
'Global travel mode' set to 'Stay on the same domain'. What do I need to
change to keep it on one domain?
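For comparison, the same settings can be expressed with httrack's command-line options. This is only a sketch, assuming a hypothetical start URL of example.com; the relevant flags are `-d` (stay on the same principal domain, the CLI equivalent of the GUI's "Stay on the same domain" travel mode) and `%e0` (external links depth 0):

```shell
# Hypothetical mirror that should never leave example.com.
# -O ./mirror : output directory
# -d          : stay on the same principal domain (travel mode)
# %e0         : external links depth 0 ("maximum external depth" = 0)
# "get non-HTML files near a link" (-n) is left off, matching the settings above.
httrack "https://www.example.com/" -O ./mirror -d %e0
```

If external robots.txt fetches are still the problem, an explicit scan rule such as "+*.example.com/*" can further restrict which URLs are even considered.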
 


All articles

Subject Author Date
Too many robots.txt   06/29/2008 16:51
Re: Too many robots.txt   06/29/2008 20:26
Re: Too many robots.txt   07/23/2008 11:41
Re: Too many robots.txt   07/28/2008 04:24
Re: Too many robots.txt   04/18/2009 17:40

Created with FORUM 2.0.11