HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: HTTrack will not copy local webpages or this URL:
Author: Xavier Roche
Date: 04/17/2003 23:57
 
> I can't seem to get HTTrack to copy the pages linked 
> from the URLs on the page...!

'Set Options' / 'Scan Rules'

And define which additional directories or domains you wish 
to download, for example:
+www.foo.com/subdir/*
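
The same rules can also be passed on the command line; a 
minimal sketch, where www.foo.com and the output directory 
are placeholders:

httrack "http://www.foo.com/" -O ./foo-mirror "+www.foo.com/subdir/*"

Here -O sets the output path of the mirror, and each 
+pattern widens what the engine is allowed to fetch.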

> and I KNOW how to use this program 
> well... I set it to follow robots.txt, then tried not 
> following robots.txt, set the client to MSIE 6.0 so the 
> Angelfire server won't balk, set it to go both up and 
> down, get html files first, get all links to a file, 
> get all filetypes I'm looking for, default 
> site-structure, default on tolerant requests for 
> servers, parse javascript for urls, etc.... 

Scan rules (filters) are generally sufficient; the expert 
functions (go down/up, etc.) are therefore unnecessary and 
redundant (and less powerful).
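
For example (assuming the pages in question are hosted on 
Angelfire), a single rule such as:

+*.angelfire.com/*

lets the engine follow any matching link, whether it points 
"up" or "down" in the directory tree, which is what the 
travel options used to control.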

> I also discovered that HTTrack doesn't know what to do 
> with web pages it doesn't have to download; i.e. HTML 
> web pages saved onto one's own hard drive! 

HTML pages that are not captured are always rewritten with 
absolute addresses (http://...)
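
A small sketch of this behavior (the file names are 
hypothetical): if other/page.html is left outside the 
mirror, a link written in the original page as

<a href="other/page.html">

is saved as

<a href="http://www.foo.com/other/page.html">

so it still works online even though the file is not on 
disk.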

> It would be great to be able to create a web page with 
> links, and use HTTrack to check all the links to make 
> sure they are all current and active! *without having 
> to first upload the webpage to a server on the Web*

"Test all links" ?
 