> 'Disallow: /' won't be recognized (too general), use
> at least folders (even multiple ones). Any agent will
> work, as HTTrack restricts its paths using all paths
> found in the robots.txt file.
Oh, that isn't very nice, because I have a lot of dirs,
and so I have to disallow every dir for HTTrack :-(
User-agent: HTTrack
Disallow: /dir1/
Disallow: /dir2/
Disallow: /dir3/
...
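Python's standard `urllib.robotparser` applies the same prefix matching on `Disallow` paths, so you can sanity-check rules like the ones above locally before deploying them (a sketch; `dir1`/`other` are placeholder names):

```python
from urllib.robotparser import RobotFileParser

# Per-directory rules, as in the robots.txt example above.
rules = """\
User-agent: HTTrack
Disallow: /dir1/
Disallow: /dir2/
Disallow: /dir3/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Paths under a disallowed dir are blocked for the HTTrack agent...
print(rp.can_fetch("HTTrack", "/dir1/page.html"))   # False
# ...while paths not matching any Disallow prefix stay fetchable.
print(rp.can_fetch("HTTrack", "/other/page.html"))  # True
```

This only models a compliant parser, of course; as noted below, a client that disables robots.txt checking ignores these rules entirely.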
> But note that robots can be disabled by the user using
> advanced options, and therefore this may not be
> sufficient.
I know, but I hope to block at least some HTTrack users,
because the site's content is also available as a download
file, so there is no need to use tools like HTTrack.
> See also related documentation information at:
> <http://www.httrack.com/HelpHtml/abuse.html>
I read it before.
Thanx, Stefan