It also seems that some sorts of anonymous browsing are
filtered too.
What a pity!
Filtering robots used to seem like a 24-hour-a-day job for
webmasters (sometimes for good reasons, but at the expense of
working on content); now it seems to be one for providers too
(instead of offering a faster or better service, which would
put an end to a lot of mirroring by people who hate slow
connections).
We all know that it will only discourage honest people, who
identify themselves and respect robots.txt.
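A well-behaved robot checks a file like this before fetching
anything (a made-up example, not my actual robots.txt):

  User-agent: *
  Disallow: /cgi-bin/
  Crawl-delay: 10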
What will be the next step to reduce bandwidth usage? If
malicious visitors identify themselves as Googlebot or any
other search engine robot, will search engines be banned too?
I hope the future will be better.
Anyway, I will use HTTrack to offer a zipped copy of the
static part of my site, so keep up the good work.
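Something like this should do it (an untested sketch; the URL
and paths are placeholders for my own setup):

  httrack "http://www.example.com/" -O ./mirror "+www.example.com/*"
  zip -r site-static.zip mirror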
Best regards