HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: A sample bat file
Author: Peter Baumann
Date: 09/20/2012 06:19
 
Thank you for the answer. Unfortunately, it raises more questions:

1) The manual says that -P1 is related to a proxy. What does this have to do
with the filters?
2a) The result of running this with --get does not create any sub-paths for
the URLs, and it only fetches one file per URL. Neither is what is needed,
because my parsing software cannot then detect which URL a file belongs to.

2b) "Only the one page" is also not what is needed, but rather all HTML pages
up to the second level.
I referred to this:
>rN  set the mirror depth to N (* r9999) (--depth[=N])
and from this I understood that "-r2" means: get all files up to the second
level (see the sketch after this list).

4) Yes, I know that, but I see no advantage, as this only doubles the number of
files needed to run the jobs. This is because the results must not go into one
central directory, which could not be handled. Thus each job needs its own bat
file anyway.
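
For illustration, here is a minimal per-job bat sketch of what I am assuming
(the URL and the output directory are placeholders, not taken from this
thread): one httrack call per job, with its own -O output directory and -r2
for the second level.

@echo off
rem Hypothetical per-job batch file: mirror one site to depth 2
rem into its own output directory, so the results of each URL stay separate.
rem The URL and the path below are placeholders only.
httrack "http://www.example.com/" -O "C:\mirrors\job1" -r2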

Sorry to bug you, but how about:
- the jobs not finishing
- and the missing searchable index file?
 