HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: To protect site against Spider engines like HTTra
Author: Kamrul Hassan Bappy
Date: 01/28/2009 22:57
 
# Robots.txt file [ Developed by : www.khola-janala.com ]
# These rules will protect your site from grabbers such as HTTrack
# last update validated 01/02/09 by Khola-Janala.com

User-agent: httrack
Disallow: /

User-agent: NetCaptor
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: SpiderKU/0.9
Disallow: /

User-agent: Steeler
Disallow: /

User-agent: WebCopier v3.3
Disallow: /

User-agent: WebCopier v3.2a
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: webcrawler
Disallow: /

User-agent: Web Downloader/4.9
Disallow: /

User-agent: Web Downloader/5.8
Disallow: /

User-agent: WebGather 3.0
Disallow: /

User-agent: WebStripper/2.56
Disallow: /

User-agent: WebZIP/3.65
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: Wget
Disallow: /

User-agent: Zao
Disallow: /

User-agent: Zeus 2.6
Disallow: /

User-agent: *
Disallow: /cgi-bin/
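As a quick sanity check (not part of the original post), the rules above can be tested with Python's standard-library `urllib.robotparser`. The trimmed robots.txt below keeps just two representative entries from the list; the agent names used in `can_fetch` are examples.

```python
from urllib.robotparser import RobotFileParser

# A trimmed version of the robots.txt above: one blocked grabber
# plus the catch-all rule for every other user agent.
ROBOTS_TXT = """\
User-agent: httrack
Disallow: /

User-agent: *
Disallow: /cgi-bin/
"""

rfp = RobotFileParser()
rfp.parse(ROBOTS_TXT.splitlines())

# HTTrack is denied everywhere on the site...
print(rfp.can_fetch("httrack", "/index.html"))        # False
# ...while other agents are only barred from /cgi-bin/
print(rfp.can_fetch("SomeBrowser", "/index.html"))    # True
print(rfp.can_fetch("SomeBrowser", "/cgi-bin/test"))  # False
```

Bear in mind that robots.txt is purely advisory: well-behaved tools like HTTrack honor it by default, but a user can configure their copy to ignore it, so server-side measures are needed for real enforcement.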
 