HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Download a forum thread
Author: WHRoeder
Date: 04/10/2013 23:28
1) Always post the ACTUAL command line used (or log file line two) so we know
what the site is, what ALL your settings are, etc.
2) Always post the URLs you're not getting and from what URL they are linked.
3) Always post anything USEFUL from the log file.
4) If you want everything, use the near flag (get non-html files related), not
filters.
5) I always run with A) No External Pages, so I know where the mirror ends;
B) browser ID=msie 6 pulldown, as some sites don't like an HTTrack one;
C) Attempt to detect all links (for JS/CSS); and D) Timeout=60, retry=9, to
keep temporary network interruptions from deleting files.
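Settings A-D above map onto httrack command-line options roughly as follows; this is a minimal sketch, and the forum URL and thread id are hypothetical placeholders:

```shell
# Sketch of settings A-D as an httrack command line (URL is hypothetical).
httrack "http://example.com/forum/viewtopic.php?t=1234" -O ./mirror \
    -x \
    -F "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)" \
    '-%P' \
    -T60 -R9
# -x   : A) No External Pages (replace external links by error pages)
# -F   : B) browser identity string (here, MSIE 6)
# -%P  : C) extended parsing, attempt to detect all links
# -T60 : D) 60-second timeout
# -R9  : D) retry up to 9 times on failure
```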

> <>
> tml
> <>
> 0.html ..

> How can I download all the pages of the thread? - If
> I try to do it, I always only get the first page. I
> also tried the filters, but it didn't make any
> difference. I only got one page.
> The hts-log says:
> 21:05:42	Info: 	Note: due to
> remote robots.txt rules, links beginning with these
> path will be forbidden: /, /adm/, /cache/, /conf/,
> /docs/, /download/, /files/, /images/, /includes/, ...

Does the (non-site portion) of
> <>
start with /? Does robots.txt tell you that links beginning with / are
forbidden? Did you override robots?
If you only want that thread (not everything), add the Near flag and filter: -*
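Overriding robots.txt and restricting the mirror to one thread could look like the sketch below; it assumes phpBB-style URLs (viewtopic.php?t=NNN), and the site and thread id are hypothetical:

```shell
# Sketch: mirror a single thread only (URL and thread id are hypothetical).
httrack "http://example.com/forum/viewtopic.php?t=1234" -O ./thread \
    -n -s0 \
    '-*' '+*viewtopic.php*t=1234*'
# -n  : near flag, also fetch non-html files related to downloaded pages
# -s0 : never follow robots.txt rules (overrides the forbidden paths)
# -*  : exclude everything by default
# +*viewtopic.php*t=1234* : re-include every page of this thread
```

The `-*` exclude-all filter is what the reply above refers to; the re-include pattern must match however the forum numbers the thread's pages.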
All articles:
Download a forum thread (04/10/2013 21:08)
Re: Download a forum thread (04/10/2013 23:28)