HTTrack Website Copier
Free software offline browser - FORUM
Subject: getting confused with the basic settings...
Author: mast
Date: 01/31/2004 23:11
Hi Xavier,

I used HTTrack a lot in the past, but now I'm confused:
I can't seem to get what I want, which sounds
fairly basic.

There's a web page I use as a test. It has an index file
and another file, "test.htm", which is "hidden" (there is no
link from index to test). The test page has a link to Google.
The scan rules are set to www.*.com/*. I tried all of the following:

travel mode: *up and down*
global travel mode: *go everywhere on the web*
limits: max depth set to 3
limits: external max depth set to 3
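For reference, I believe the same settings can be expressed roughly like this on the command line (example.com and the output path are just placeholders, not my actual site):

```shell
# Rough command-line equivalent of the GUI settings above.
# "+www.*.com/*" is the scan rule (filter).
# -r3 sets the maximum mirror depth to 3.
# %e3 sets the maximum external-links depth to 3.
httrack "http://www.example.com/" -O /tmp/mirror "+www.*.com/*" -r3 %e3
```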

but when it copies the site, it seems to save only the index
file and not the test.htm file.
Even when I set test.htm as the page to download, it would
indeed copy test.htm, but then it wouldn't follow the link to
Google's page and save it.

So I guess my question is extremely basic with regard to
HTTrack: what settings do I need to use in order to
*get all the pages scanned, including the pages
mentioned as links in those web pages*?

I am sorry if this looks really trivial, but I really feel
frustrated not to be able to understand what I am doing ;-)
Otherwise it's great software, and I am glad you did this
work for free - really nice of you.


I hope this thread will help some other people.
