HTTrack Website Copier
Free software offline browser - FORUM
Subject: Need help
Author: Allen_Cheng
Date: 01/28/2004 04:39
 
I am trying to download a webpage using WinHTTrack and have run into the 
site's robots.txt rules. Can you tell me where I can disable these rules 
just for this case, and whether it is safe to disable them here?

From the log file:

Info: 	Note: due to www.asianbud.com remote robots.txt rules, links begining 
with these path will be forbidden: /_private/, /_vti_inf.html, /_vit_bin/, 
/_vit_cnf/, /_vti_log/, /_vti_pvt/, /_vti_txt/, /_alive/, /_borders/, 
/_derived/, /_fpclass/, /include/, /images/, /webstats/, /WebStats/, /test/, 
/scripts/, /New_Folder/, /cgi-bin/ (see in the options to disable this)
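
In case it clarifies what I am asking about: the log seems to point at the 
spider option that controls robots.txt handling. Below is only a rough sketch 
(not something I have verified) of what I believe the command-line equivalent 
would look like, assuming the httrack engine is installed and using 
"./asianbud-mirror" as a made-up output folder; in WinHTTrack I assume the 
same setting lives somewhere in the options panels.

import subprocess

# Rough sketch: call the command-line engine with the spider option set to 0
# so that remote robots.txt rules are not followed.
# "./asianbud-mirror" is just an example output directory name.
subprocess.run([
    "httrack", "http://www.asianbud.com/",
    "-O", "./asianbud-mirror",   # output path (example)
    "-s0",                       # robots.txt handling: 0 = never follow
], check=True)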

Thanks for a wonderful application (and it is free!)
 