Hi,
I've tried to copy a site with HTTrack, but it generates too many files, so I'm
looking for a way to:
- Copy a site from a list of HTML files (list.txt), and:
1) include the CSS, JavaScript, and images used by each page in each of the
URLs in list.txt;
2) exclude linked pages that are not on the list.
I'm using httrack from the command line (macOS).
I've made many attempts, but either I couldn't include the CSS, JS, and images,
or I couldn't exclude HTML URLs that aren't on the list.
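From the man page, the closest I've come is something like this (a sketch, not
a working setup; the flags -%L/--list, -O/--path, -rN/--depth, and -n/--near
are documented options, but I may be combining them wrong, and ./mirror is just
the output folder I picked):

    httrack -%L list.txt -O ./mirror -r1 -n

Here -%L reads the start URLs from list.txt (one per line), -O sets the mirror
directory, -r1 limits the depth so no linked pages beyond the listed ones are
followed, and -n is supposed to also fetch the non-HTML files (CSS, JS, images)
each page uses. I'm not sure whether -r1 also blocks those embedded files,
whether -n is enough to pull them in, or whether I need explicit filters like
"-*" "+*.css" "+*.js" "+*.png" instead.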
Thank you for your interest.