A robots.txt file is a convention sites use to tell spiders (such as Google's
crawler or HTTrack) which files they may or may not fetch. You can tell HTTrack
to ignore the robots.txt rules if you have a legitimate reason to mirror the
site.
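For example, HTTrack's -sN option controls how robots.txt (and meta robots tags) are honored; a sketch of an invocation that disables the rules (the URL and output directory are placeholders, and you should verify the flag against `httrack --help` for your version):

```
# Mirror a site while ignoring robots.txt rules:
#   -s0  never follow robots.txt / robots meta tags
#   -O   output (mirror) directory
httrack "https://example.com/" -O ./mirror -s0
```

Only do this on sites where you have permission to crawl the excluded content; robots.txt often exists to protect the server from heavy automated traffic.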