HTTrack Website Copier
Free software offline browser - FORUM
Subject: Copying single page/all dependencies (yes, again)
Author: Brian Wilson
Date: 08/03/2004 17:18
 
I still don't really understand why this isn't possible the
way things work now. Admittedly, I also don't fully understand
the internal/external link settings that might allow it. I
keep thinking that setting the external link depth to 1 should
work, but that only copies the page itself and nothing else.

If I set it to 2, the page dependencies come down fine, but so
does the first HTML page behind every outbound link.
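
For reference, here is roughly what I have been trying from
the command line. The URL is just an example, and since I
mostly use the GUI I am assuming -r is the mirror depth switch
and -%e the external link depth:

   httrack "http://www.cnn.com/" -O ./cnn -r1 -%e1

With -%e1 I get only the page itself; changing it to -%e2
pulls the dependencies but also those unwanted linked pages.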

I had the thought that you could set the external link depth
to 2 and then eliminate the linked HTML pages with something
like
   -*.htm -*.html -*.asp -*.aspx
in the scan rules section, but there seems to be a clear
delineation internally between page dependencies and external
links.
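
Spelled out as a full command, the idea was something like
this (same example URL; the filters are quoted so the shell
leaves them alone):

   httrack "http://www.cnn.com/" -O ./cnn -r1 -%e2 "-*.htm" "-*.html" "-*.asp" "-*.aspx"

But the filters never seem to get a chance to help, presumably
because of that internal split between dependencies and
external links.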

Xavier mentioned that a page-plus-dependencies-only mode was
"coming soon" when 3.32-2 came out, but I haven't heard any
mention of it since. Right now that is the only thing I want
to use HTTrack for, so I've been waiting for it to arrive.

I've attempted the "use HTTrack to grab too much and then
clean up afterwards" approach, but what I'm after is a
complete local copy of something like CNN's home page, and
cleaning up a depth-2 HTTrack download of a site that complex
takes, well... it takes hours. At that point there is almost
no gain to using HTTrack at all.

Is there an easy way to kludge a single page with all of its
page dependencies NOW with HTTrack, rather than the bulky
method I've tried before? Since there is no ETA on this
specific feature yet, I'd like to finally make some progress
on this project if possible.
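
One unverified guess: the option list mentions -n (get
non-HTML files "near" an HTML file, even ones located outside
the site). Combining that with a depth-1 mirror might
approximate page-plus-dependencies:

   httrack "http://www.cnn.com/" -O ./cnn -r1 -n

I haven't tested whether -n really catches everything (frames,
stylesheets, script-referenced images), so if anyone can
confirm or correct this, that alone would unblock me.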

Thanks,
-Brian
 