HTTrack Website Copier
Free software offline browser - FORUM
Subject: selective non-parallel download
Author: robert
Date: 02/04/2004 17:38
 
I've read the tutorial and checked my options several 
times, but it's still not working.
I want to start from the URL
<http://www.newsisfree.com/sources/bycat/1>

I then want to spider through the links on that single 
page, but only the links for this directory:
<http://www.newsisfree.com/sources/info/>

I don't want to download every page in /info/, 
only the pages which are linked-to by "bycat/1".

For the resulting pages in the /info/ directory, 
I want to download only linked-to pages in this directory:
<http://www.newsisfree.com/sources/rss/>

When I try to do this in WinHTTrack, I immediately see that 
undesired pages are being processed, for example 
<http://www.newsisfree.com/sources/bycat/0>
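
For reference, here is roughly the command-line equivalent of what 
I'm attempting (a sketch only -- the scan rules and depth are my 
guess at how to express the restriction, and the output directory 
name is made up):

```shell
# Start from the category page, follow links at most three levels
# deep (bycat/1 -> info/ pages -> rss/ pages), and use scan rules
# to exclude everything except the three areas of interest.
httrack "http://www.newsisfree.com/sources/bycat/1" \
    -O ./newsisfree \
    -r3 \
    "-*" \
    "+www.newsisfree.com/sources/bycat/1" \
    "+www.newsisfree.com/sources/info/*" \
    "+www.newsisfree.com/sources/rss/*"
```

This still doesn't literally say "only /info/ pages linked from 
bycat/1", but I would expect the depth limit combined with the 
filters to approximate it -- is that the right approach?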

Thank you.
 