HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: newbie - download list of urls
Author: William Roeder
Date: 01/11/2010 18:52
 
> I'm downloading a text file list of urls but am
Are you using Action = multiple sites, not download/update?

> So I set my filters to be:
> -*
> +articles.site.com/article/*.*
> +images.site.com/*.*
> +img.site.com/*.*
> -*.pdf
Don't use *.* except on a DOS command line; "*" is sufficient. Filters are also
unnecessary with the multiple-sites action.
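
If you do keep filters, the same rule set with plain "*" would look like this
(domains copied from your post, only the trailing *.* changed):

  -*
  +articles.site.com/article/*
  +images.site.com/*
  +img.site.com/*
  -*.pdf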

> My mirroring depth and external depth are set to 0
> because I only want the pages in the list of URLs. I
Depth=0 means everything. The first page is depth 1; subsequent pages and images
are depth 2, etc.
> have checked the "Get non-HTML files related to a
> link" checkbox.
Set depth=2, check "get non-HTML files related to a link", and keep the -*.pdf filter.
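
For reference, roughly the same setup from the command line (a sketch only: urls.txt
and ./mirror are placeholder names, and it assumes the stock httrack options --list,
-r and --near behave as documented):

  httrack --list urls.txt -O ./mirror -r2 --near "-*.pdf"

--list pulls the URLs from the text file, -r2 limits the mirror depth to 2, --near
grabs non-HTML files linked from those pages, and the -*.pdf rule excludes PDFs.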
 