HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Downloading picture files
Author: WHRoeder
Date: 04/04/2013 04:17
 
1) Always post the ACTUAL command line used (or line two of the log file) so we
know what the site is, what ALL your settings are, etc.
2) Always post the URLs you're not getting and the URL from which each one is
referenced.
3) Always post anything USEFUL from the log file.
4) If you want everything, use the near flag (get non-html files related) rather
than filters.
5) I always run with A) No External Pages, so I know where the mirror ends;
B) browser ID = msie 6 from the pulldown, as some sites don't like an HTTrack one;
C) Attempt to detect all links (for JS/CSS); and D) Timeout=60, retry=9, so that
temporary network interruptions don't cause files to be deleted. A sketch of the
equivalent command line follows this list.
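
For what it's worth, points 4) and 5) map roughly onto a command line like the
sketch below; treat it as an illustration only. The URL, the ./mirror output
directory, and the exact MSIE 6 user-agent string are placeholders, so check the
option letters against httrack --help (or just read line two of your own log
file) before copying it.

  httrack http://www.example.com/ -O ./mirror -n -%P -T60 -R9 -F "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"

Here -n is the near flag (get non-html files related), -%P attempts to detect
all links, -T60/-R9 are the timeout and retry settings, and -F sets the browser
ID string. No External Pages is set from the WinHTTrack options rather than by a
separate letter shown here.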

> settings be to download all image files only and no
> other files whatsoever. 

You put ALL the URLs in the box. If you do not have them, it can NOT be done
that way; you MUST let it spider the site to get them.
Nothing but html and images: -* +*.html +*.jpg ...
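
As a sketch of how those filters sit on a full command line (again, the URL and
output directory are placeholders, and the trailing ... above is where any other
image types you want would go, e.g. +*.png or +*.gif):

  httrack http://www.example.com/ -O ./mirror "-*" "+*.html" "+*.jpg"

Quoting the patterns keeps the shell from expanding them; in WinHTTrack you
paste the same patterns into the Scan rules box instead.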
 