You can disable the reading of robots.txt in
Set Options / Spider, then select "no robots.txt rules".
However, it may not be safe to do so without filtering in
the scan rules.
Add
-*/_vti*
-*/_private*
etc.
in order to download only what's necessary and avoid
overloading the server (you may still need /images/ and /_fpclass/).
Also limit the number of connections and the bandwidth usage;
otherwise you may be blocked if the webmaster is upset by
brutal mirroring of his site.