Hi there,
I am trying to do something similar -- I have a text file with 38 individual
URLs listed. I want to download JUST those 38 files.
I'm brand new to this program. The call to HTTrack is inside a batch file
written by someone else, and now I'm trying to use it. I suspect that the
person who set it up didn't really check what they were retrieving... or that
something went wrong when I installed the software on my machine.
They originally had this:
start /w %BatchPath%httrack\httrack.exe --list %BatchPath%SiteLinks.txt -I0
(where %BatchPath% is a variable pointing to the location of the EXE and TXT
files)
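As far as I can tell, the relevant part of the batch file boils down to setting
that variable and making the one call, roughly like this (the path here is made
up; mine points somewhere else):
rem Example only: BatchPath is the folder containing the httrack\ subfolder and SiteLinks.txt
set BatchPath=C:\Tools\HTTrackJob\
start /w %BatchPath%httrack\httrack.exe --list %BatchPath%SiteLinks.txt -I0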
The -I0 part gave an error, and I found it should have been --Index=N instead.
Now I'm trying to figure out why the download is going outside of what is
listed in SiteLinks.txt.
I currently have it set to this:
%BatchPath%httrack\httrack.exe --list %BatchPath%SiteLinks.txt --priority=7 --verbose
What other parameters should I be using? I've looked at --get-files, but it
doesn't seem restrictive enough: the description says it just gets files and
saves them in the current directory. I tried --stay-on-same-dir, but that
doesn't seem restrictive enough either (and it sounds like much the same
thing?). Or should I be using --depth=1?
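If it helps, the next thing I was going to try is simply adding --depth=1 to
what I already have (I have no idea yet whether that is the right restriction):
%BatchPath%httrack\httrack.exe --list %BatchPath%SiteLinks.txt --depth=1 --priority=7 --verbose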
Suggestions / pointers greatly appreciated!
Thanks,
Beverley