> We're running WinHTTrack Website Copier 3.42-2 and
> are having no success in limiting downloads to our
> own URL. I can't seem to find the correct settings
> to prevent spidering and downloading any linked
> external sites.
>
> I have:
> - listed the URLs in the Scan Rules window
> - set 0 for the Max exterior depth limit
> - selected the No External pages toggle in the Build tab
>
> The program is going out and spidering and
> downloading linked sites external to us. Am I
> missing something?
No External pages = checked builds a local placeholder page that appears when
you click an external link in the mirror; unchecked, you'll hit the live net.
It only affects how external links are built, not what gets downloaded.
Max exterior depth = 0 prevents the engine from going off-site at all.
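On the command line the same limit is the %e option. A minimal sketch; the
site URL and output directory are placeholders, not from this thread:

  # %e0 = external links depth 0, i.e. "Max exterior depth" set to 0
  httrack "http://www.example.com/" -O ./mirror %e0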
Links -> Get non-HTML will still fetch images etc. even if they're off-site;
uncheck it if you want nothing external at all.
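For the Scan Rules window, filters that hard-exclude everything off-site
would look roughly like this (later rules override earlier ones;
www.example.com is a stand-in for your real domain):

  -*
  +www.example.com/*

The same patterns can be passed as command-line arguments; also leave out
the -n option there, since -n is what fetches non-HTML files "near" a page
even when they sit on another host.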
Missing something? 3.4x has many problems; try 3.32 instead.