Thanks for all your answers. But I don't think this is a problem with robots.txt; I have seen this issue before.
I think this is a limitation of HTTrack. Is it correct to say that HTTrack first downloads all the links (HTML, CSS, images, ...) of the first page (I am using -p3), and only then downloads the links found in each of the HTML files it has already fetched?
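To make clear what order I mean, here is a minimal sketch in Python of a breadth-first crawl, which is the behavior I think I am seeing: every link found on the start page is fetched before any link found on a deeper page. The real HTTrack engine is of course far more elaborate (bandwidth limits, filters, parallel connections), and the fetch/parse code below is just a crude illustration with a placeholder URL, not HTTrack's actual logic:

    import re
    import urllib.request
    from collections import deque

    def crawl(start_url, max_depth):
        """Fetch pages breadth-first and return URLs in download order."""
        order = []
        queue = deque([(start_url, 0)])
        seen = {start_url}
        while queue:
            url, depth = queue.popleft()  # FIFO queue => breadth-first order
            try:
                html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except OSError:
                continue  # skip unreachable pages
            order.append(url)
            if depth == max_depth:
                continue
            # Crude link extraction; a real crawler uses an HTML parser.
            for link in re.findall(r'href="(https?://[^"]+)"', html):
                if link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))
        return order

    # Every link discovered at depth 0 is fetched before any depth-1 link,
    # so a page sitting late in the queue only arrives after all the
    # first-page assets ahead of it have been downloaded.
    print(crawl("https://example.com/", 1))

If HTTrack queues URLs in this FIFO fashion, that would explain why a deep page stays broken until the whole first layer (including all its images) has come down.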
Yesterday I tried downloading 3 MB and then 6 MB, but the quality of the link "Postal Union Delivers Royal Mail Ultimatum" was still bad. I noticed this is because HTTrack was still downloading the images associated with the first page. I have to download about 50 MB before this "Postal Union Delivers Royal Mail Ultimatum" link comes through fine.
Does this make sense to you? Can you confirm or refute my remark?