Your question is vague; please be more specific.
To download something in general:
Try using the "Download all sites in pages (multiple mirror)" action
1. Put your URL(s) in the project's list of Web Addresses
(e.g. online site: <http://www.example.com/index.html> )
(e.g. offline site: <file:///C:/Example/index.html> )
2. Options>>Scan Rules: add +* (downloads all linked files)
3. Options>>Limits: set mirror depth >2 for both internal & external depth
(how much greater than 2 depends on how many sub-levels you want)
(I err on the side of downloading more)
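For reference, the three GUI steps above map onto HTTrack's command line as well. This is a sketch, not the only valid invocation: the output directory ./mirror and the depth values 4 and 3 are example choices, not requirements.

```shell
# Mirror a site from the command line (equivalent to the GUI steps above).
#   -O     output (project) directory -- "./mirror" is just an example path
#   "+*"   scan rule: accept every link HTTrack finds
#   -r4    mirror depth (pick any value >2 for deeper sub-levels)
#   -%e3   external links depth: how far to follow links off the start site
httrack "http://www.example.com/index.html" -O "./mirror" "+*" -r4 -%e3
```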
To continue a download, use "* Continue Interrupted Download"
- alter the limits/scan rules as desired before you continue
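The same resume step exists on the command line. Assuming you run it from inside the project directory created by the earlier mirror, HTTrack picks up from its cache:

```shell
# Resume an interrupted mirror using the cache in the current project directory.
# --continue is the long form of -i; filters/limits can be adjusted on this
# command line before resuming, just as in the GUI.
httrack --continue
```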
HTTrack follows links and saves the web pages it finds.
Links in the saved pages are rewritten as relative links, so the mirror can be browsed offline;
the saved pages are viewable in any HTML browser.
See: <http://www.httrack.com/html/index.html>