Hi, I'm mirroring a website, but HTTrack downloads a very large number of robots.txt files, so the whole job takes a long time. A screenshot may help explain what I mean:
<http://img176.imageshack.us/my.php?image=35249545sm6.jpg>
How can I avoid downloading that large number of robots.txt files?
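
From what I can tell, HTTrack asks for robots.txt once per host it touches, which is probably why I end up with so many of them. Would simply turning off the robots.txt check be the right approach? I mean something along these lines (assuming the command-line httrack and its -sN / --robots option; example.com and the ./mirror output directory are just placeholders, and I haven't verified the exact syntax):

    httrack "http://example.com/" -O ./mirror -s0

i.e. -s0 (or --robots=0) would tell HTTrack not to follow robots.txt at all, if I understand the option correctly. Or is there a better way to cut down on these requests?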