HTTrack Website Copier
Free software offline browser - FORUM
Subject: Question about external capturing
Author: Chris
Date: 06/28/2012 07:14
 
So I'm running a job to capture one website.  I left it with the default limit
settings: infinite internal depth and no external depth.  It ended up being
about 200 MB total, but some of the pages wouldn't work properly.  Various
Java applets and external references wouldn't display correctly when viewed
offline.

I decided to run it again, with external depth set to 1.  That was earlier
today, and the job has now been running for over 12 hours, with over 1.6 GB
downloaded.

Does an external depth of 1 really make that much of a difference?  I can't
even see that many external links on the site I'm collecting, and yet it's
capturing hundreds of sites that have no apparent relation, in content or
otherwise, to the site I originally set out to collect.
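For reference, here is roughly what the two runs would look like as HTTrack command-line invocations. This is a sketch from the HTTrack docs as I understand them (the site URL is a placeholder, and the flag names may differ slightly between versions):

```shell
# First run: default internal depth, external depth 0
# (external links are not followed at all)
httrack "http://www.example.com/" -O ./mirror --ext-depth=0

# Second run: external depth 1 -- every external link found on every
# captured page is fetched one level deep, which can fan out to
# hundreds of unrelated sites
httrack "http://www.example.com/" -O ./mirror --ext-depth=1

# The manual also mentions a lighter option I haven't tried: fetch only
# the non-HTML files (applets, images, scripts) "near" each page,
# without mirroring external sites
httrack "http://www.example.com/" -O ./mirror --near
```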
 