To download all the links on a single page, I think you:
1. Put the page URL in the project's list of Web Addresses
2. Options >> Scan Rules: set the rules to only +*
3. Options >> Limits: set the mirror depth to 2 or more, internal and external
(how much above 2 depends on how many sub-levels you want to get)
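If you'd rather script it, the sketch below is a rough command-line equivalent of those GUI steps, run through Python's subprocess. It assumes the httrack CLI is installed and on your PATH; the URL and output directory are placeholders, and the external-depth flag is my best guess, so check it against your HTTrack version.

    # Rough CLI equivalent of the GUI steps above (a sketch, not a verified recipe).
    import subprocess

    subprocess.run(
        [
            "httrack",
            "http://example.com/page.html",  # step 1: the page whose links you want (placeholder URL)
            "-O", "mirror_out",              # project/output directory (placeholder)
            "+*",                            # step 2: scan rule accepting every link
            "-r2",                           # step 3: mirror depth of 2
            "-%e2",                          # external links depth of 2 (assumed flag, verify for your version)
        ],
        check=True,
    )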
To continue a download, use "Continue Interrupted Download";
alter the limits/scan rules as desired before you continue.
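On the command line, the resume step looks roughly like the following (again only a sketch: I believe -i continues an interrupted mirror from the project's cache, but the exact invocation may differ by version, and you need to point it at the same output directory as before).

    # Resume the interrupted mirror from its cache (same placeholder output dir as above).
    import subprocess

    subprocess.run(["httrack", "-i", "-O", "mirror_out"], check=True)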
(What do you mean by 'the "multiple mirror" option'?)