> I have one question: I am using HTTrack to download a list of
> URLs, but it does not check the robots.txt file of
> these URLs.
robots.txt rules, unless disabled on the command line or in the
GUI, are normally checked automatically for HTML pages.
(Standalone files, such as external GIF images fetched without any
other HTML data, do not trigger a robots.txt check.)
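As a sketch, the robots.txt behaviour can be adjusted with HTTrack's `-sN` option (`-s0` = never follow robots.txt, `-s2` = always follow it); the URL-list file name and output directory below are placeholders:

```shell
# Mirror a list of URLs while forcing robots.txt to be obeyed
# for every request (-s2), not just for HTML pages:
httrack --list urls.txt -s2 -O ./mirror

# Conversely, disable robots.txt checking entirely (-s0):
httrack --list urls.txt -s0 -O ./mirror
```

Note that disabling robots.txt checks should only be done on sites you are permitted to mirror in full.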