HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: .readme
Author: Xavier Roche
Date: 07/02/2001 21:48
 
> But the idea is that I don't want any recursion, i.e.
> the depth level is always 0, as I work with a fixed list
> of URLs. Isn't there any way around it anyway?
Well, 0 depth? You only want to download the listed URLs,
and that's all (no links followed afterwards)?

In this case use:
-*

as a filter; it will only download the primary URLs
(the URLs you gave).
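
For example, a minimal command line might look like the
following (a sketch only: the list file and output folder
names are placeholders, and I'm assuming the %L option to
read additional URLs from a text file, one URL per line):

httrack %L mylist.txt -O ./mirror "-*"

The quotes around -* just keep the shell from expanding it;
the URLs taken from the list are primary URLs, so the -*
filter does not block them.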

 