Robots.txt is a file that site admins use to keep automated spiders from eating their bandwidth, and also to discourage people from grabbing certain content, especially if it is proprietary. You can configure HTTrack to ignore the robots.txt file, and then you should be able to download the files you want. Cheers!
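For example, HTTrack's `-sN` option controls whether it obeys robots.txt rules (0 = never, 2 = always). A minimal command-line sketch (the URL and output directory are placeholders):

```
# Mirror a site while ignoring robots.txt rules (-s0).
# example.com and ./mysite are hypothetical; substitute your own.
httrack "http://example.com/" -O ./mysite -s0
```

In the WebHTTrack GUI the equivalent setting is under the spider options, where you can tell it to ignore robots.txt rules. Keep in mind the file is usually there for a reason, so go easy on the server.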