You can use JavaScript to block HTTrack and other web bots, but then you won't be indexed by search engines either. Server-side techniques can work, but .htaccess alone will not be helpful in your case. The most effective approach is usually to block any user (by IP or session) requesting too many pages or sending too many requests, rather than trying to fool the robot (a minimal sketch of this follows the links below).

You can find useful information here:
<http://www.httrack.com/html/abuse.html#WEBMASTERS>
<http://www.garykeith.com/browsers/bad-bots.asp>

WebmasterWorld has many threads about blocking spiders and spider traps (using Perl, PHP, ASP...); you can search for "web spider traps" and find solutions to bandwidth abuse.

And in French:
<http://www.1001bd.com/stop_aspirateurs/>
<http://www.toulouse-renaissance.net/c_outils/c_interdire_aspirateurs.htm>
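As a rough illustration of the "too many requests" idea, here is a minimal PHP sketch of session-based rate limiting, to be included at the top of each page. The thresholds (30 requests per 60 seconds) and the `rl_*` session keys are illustrative assumptions, not recommendations:

    <?php
    // Minimal session-based rate limiter (sketch, not production code).
    session_start();

    $window  = 60;  // length of the counting window, in seconds
    $maxHits = 30;  // requests allowed per window (tune for your site)
    $now     = time();

    // Start a fresh window if none exists or the old one has elapsed.
    if (!isset($_SESSION['rl_start']) || $now - $_SESSION['rl_start'] > $window) {
        $_SESSION['rl_start'] = $now;
        $_SESSION['rl_hits']  = 0;
    }

    $_SESSION['rl_hits']++;

    // Too many requests in this window: refuse service to this client.
    if ($_SESSION['rl_hits'] > $maxHits) {
        header('HTTP/1.1 429 Too Many Requests');
        header('Retry-After: ' . ($window - ($now - $_SESSION['rl_start'])));
        exit('Rate limit exceeded.');
    }
    ?>

Note that session-based counting only catches bots that keep cookies; a downloader that discards them would need IP-based counting instead, e.g. keyed on $_SERVER['REMOTE_ADDR'] in a database or shared cache.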