HTTrack Website Copier
Free software offline browser - FORUM
Subject: Banning robots
Author: D_A
Date: 10/30/2003 10:31
 
It also seems that some sorts of anonymous browsing are
filtered too.
What a pity!
Filtering robots already seemed to be a 24-hour-a-day job for
webmasters (sometimes for good reasons, but at the expense of
working on content); now providers seem to be doing it too
(instead of offering a faster or better service, which would
prevent a lot of the mirroring done by people who hate slow
connections).
We all know that it will only discourage honest people, the
ones who identify themselves and read robots.txt.
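For the curious, here is a minimal Python sketch of what a
polite robot does before fetching anything (the URL is just a
hypothetical example):

    # Sketch: how a well-behaved robot honours robots.txt.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()  # fetch and parse the file

    # Ask whether a given user-agent may fetch a given path.
    if rp.can_fetch("HTTrack", "http://www.example.com/private/"):
        print("allowed")
    else:
        print("disallowed by robots.txt")

A malicious robot simply skips this step, which is exactly why
banning based on it only hurts the honest ones.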
What will be the next step to reduce bandwidth usage? If
malicious visitors identify themselves as Googlebot or any
other search engine robot, will search engines be banned too?
I hope the future will be better.
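The User-Agent header proves nothing, since any client can
send whatever it likes; about the only real check a server can
do is a double DNS lookup. A rough Python sketch of that idea
(the IP address is just a hypothetical example):

    # Sketch: verify a claimed Googlebot by reverse + forward
    # DNS, since the User-Agent header can be set to anything.
    import socket

    def is_real_googlebot(ip):
        try:
            host = socket.gethostbyaddr(ip)[0]  # reverse lookup
        except socket.herror:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup must map the name back to the same IP.
        return socket.gethostbyname(host) == ip

    print(is_real_googlebot("66.249.66.1"))  # hypothetical IP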
Anyway, I will use HTTrack to offer a zipped copy of the
static part of my site, so keep up the good work.
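Something along these lines should do it, assuming the usual
httrack command line (URL and paths are just hypothetical
examples):

    # Sketch: mirror the static part of a site with HTTrack,
    # then pack the result into a zip for visitors to download.
    import subprocess
    import shutil

    # "httrack <url> -O <dir>" is the basic command-line form.
    subprocess.run(["httrack", "http://www.example.com/",
                    "-O", "mirror"], check=True)

    # Creates site.zip from the contents of the mirror directory.
    shutil.make_archive("site", "zip", "mirror")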
Best regards
 