Such a mechanism would be a pain to manage, as it would
add one more special-case request to handle.
Robots.txt rules are generally sufficient, except for
the rare stupid users who clearly want to abuse
bandwidth. The problem, again, is not a user disabling
one or two safeguards; the problem is a user disabling
them all while sitting on a large pipe. That case is
really hard to detect, and I don't think an additional
.txt file will solve it.
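
For reference, a minimal robots.txt of the kind being
discussed might look like this (Crawl-delay is a de facto
extension honored by some crawlers such as Bing and
Yandex, not part of the original robots exclusion
standard; the path is made up for illustration):

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/

Nothing enforces any of this: the file only works against
clients that choose to fetch and obey it, which is exactly
why it cannot stop a deliberately abusive one.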