HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Question about external capturing
Author: William Roeder
Date: 06/28/2012 14:20
 
> So I'm running a job to capture 1 website.  I left
> it with default limit settings, infinite internal
> depth and no external depth.  It ended up being
> about 200 MB total, but some of the pages wouldn't
> work properly.  Various Java applets and external
> references wouldn't display properly when viewed
> offline.
1) Always post the command line used (or log file line two) so we know what
you did, not what you think you did.
2) The default is to stay on the site only. If you want everything related,
use the near flag (get non-html files related) to fetch them no matter where
they are stored; see the example command below.
3) A mirror is not a copy; it is a collection of files. Forms will not work,
and applets that read further data from the server will not work either.
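
For example, something along these lines (the URL and output directory are
placeholders) keeps the crawl on the site but still grabs related non-html
files stored elsewhere:

  httrack "http://www.example.com/" -O "C:\My Web Sites\example" -n

-n (long form --near) is the command-line form of the "get non-html files
related" option: images, applets and other non-html files referenced by a
page are fetched even when they sit on another server, without following
external html links.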

> I decided to run it again, with external depth set
> to 1.  That was earlier today and now my job has
Last I read (about six months ago), there was a problem with the
external-depth option: it didn't stop.
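
For reference, external depth is the %e option (the default is %e0, stay on
site); a run with external depth 1 would look something like this, again with
a placeholder URL and path:

  httrack "http://www.example.com/" -O "C:\My Web Sites\example" %e1

Even without that bug, %e1 fetches every externally-linked page once, so a
site that mirrored to 200 MB on its own can balloon badly.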
 