HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: excluding files problems
Author: BzF
Date: 03/06/2002 18:46
 
> You shouldn't need to specify the server in each 
filter.
Yes, I know, but I didn't realize it while
creating the test downloads.

> Have you tried this?:
> -*.phtml*
Yes.

> Also, does it make a difference with the assume
> option on or off?
No :-(


So it doesn't work, and now I really don't know why :-(
The only way seems to be to specify all the extensions
I _want_ to download, but that isn't a good way -
I always forget to specify some of them, or
the command line gets much too long.

(Maybe an option for loading filters from a file
would be useful - something like --list, but for
filters instead of URIs - if it isn't possible to
correctly specify which files to exclude.)

Regards
BzF


-----tests-----
I tried these 5 test downloads, but all of them
downloaded "unwanted" files.
(Each command should be on one line.)

1.,
httrack --list "c:\websites\root-test.txt"
-W --depth=4
--ext-depth=0
-O "c:\websites\root-test\01" -%v
-* -root.cz/index.* -root.cz/clanek.*
-root.cz/reklama.* +root.cz/*

2.,
httrack --list "c:\websites\root-test.txt"
-W --depth=4
--ext-depth=0
-O "c:\websites\root-test\02" -%v
-* -root.cz/index.phtml* -root.cz/clanek.phtml*
-root.cz/reklama.phtml* -root.cz/reklama.html*
+root.cz/*

3.,
httrack --list "c:\websites\root-test.txt"
-W --depth=4
--ext-depth=0
-O "c:\websites\root-test\03" -%v
-* -*.phtml* +root.cz/* --assume phtml=text/html

4.,
httrack --list "c:\websites\root-test.txt"
-W --depth=4
--ext-depth=0
-O "c:\websites\root-test\04"  -%v
-* -*.phtml* +root.cz/*

5.,
httrack --list "c:\websites\root-test.txt"
-W --depth=4
--ext-depth=0
-O "c:\websites\root-test\05"  -%v
-* -*.phtml* -*.html* -*.htm* +root.cz/*
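One thing worth double-checking (just a guess at the cause): HTTrack's scan rules are applied in order and the last matching rule wins, so in the tests above the trailing +root.cz/* may be re-including the .phtml files that the earlier -*.phtml* excluded. A reordered variant of test 4 would look like this (the \06 output directory is made up):

```shell
# Reordered scan rules (speculative): HTTrack keeps the LAST rule
# that matches a URL, so the broad include must come first and the
# excludes last, not the other way round.
httrack --list "c:\websites\root-test.txt" -W --depth=4 --ext-depth=0 \
    -O "c:\websites\root-test\06" -%v \
    -* +root.cz/* -*.phtml*
```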
 