> There is an error message about not downloading sub
> pages due to robots.txt, then refers me to options,
> but I find no options similar.
Options -> Spider -> Spider (set the dropdown to "no robots.txt rules")
> The site is downloading, but all the links on the
> page on the local computer are to the website.
Those are probably pages that were excluded by robots.txt, so they were never mirrored and the links still point at the live site.
I always run with Options -> Build -> "No error pages" checked, so I get a
warning page for unmirrored links.
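For command-line users, roughly the same settings can be passed as flags. This is a sketch, not a tested recipe: `-s0` is HTTrack's "never follow robots.txt rules" option, and the URL is a placeholder for whatever site is being mirrored.

```shell
# Mirror a site while ignoring robots.txt exclusions
# (-s0 = never obey robots.txt / meta robots tags).
# https://example.com is a placeholder URL.
httrack "https://example.com/" -O ./mirror -s0
```

Check `httrack --help` on your own install before relying on these flags, since option letters have shifted between versions.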