HTTrack Website Copier
Free software offline browser - FORUM
Subject: A real login is needed before URL is fetched.
Author: PTR-User
Date: 12/18/2009 10:12
I really need some help here... I can't get HTTrack working again.

What I want to do:

Download and save URLs from this site, like:

- The site is updated for the last time at about 6 o'clock in the morning; after
that time the site is reset. So I have to run HTTrack at about 5:50. I am not at
my computer every morning, so this part works with the atd or cron
task-scheduler daemon.
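For reference, the scheduling side is just a dated output directory plus a cron entry that fires shortly before the reset. This is only a sketch; the directory layout mirrors my Windows `%cdate%`/`%ctime%` placeholders, and the script path in the cron line is made up:

```shell
# Build a dated output directory, like the %cdate%/%ctime%
# placeholders do in the Windows version of the command.
d=$(date +%Y%m%d)
t=$(date +%H%M)
outdir="$HOME/Webseiten/Tableratings/Loser_Daily_${d}_${t}/NL100"
echo "$outdir"

# crontab line to run the mirror daily at 05:50 (add with `crontab -e`;
# the script path is a placeholder):
# 50 5 * * * /home/user/bin/fetch_tableratings.sh
```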

- I need several more URLs like this every day. Unfortunately, the site limits
searches to about 3 to 5 a day if I am not logged in, so I somehow have to log
in with HTTrack before the website is fetched.
What I already tried:

- I saved an hts-post file with WinHTTrack and used it with the command-line
version. But this works only once (HTTrack downloads the site while logged in,
but I think the session is then closed somehow, so a new hts-post file would be
needed every day). The command I used was:

-O "c:\Webseiten\Tableratings\Loser_Daily_%cdate%_%ctime%\NL100" -r1 -R3 -%%P
-c4 -X0 -s0 -Q -N Daily/NL100.%%t --index
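What I imagine is something along these lines: a small script that performs the form login fresh on every run (so the saved session can never go stale) and only then starts HTTrack, which reads a Netscape-format `cookies.txt` found in the mirror directory. This is only a sketch; the login URL, form field names, and credentials are placeholders I made up, and the form field names would have to be taken from the site's actual login page:

```shell
# Write a helper script that logs in with curl first, storing the
# session cookie in cookies.txt inside the project directory (HTTrack
# picks up a cookies.txt found there), then runs the mirror.
# Everything site-specific below is a placeholder.
cat > fetch_with_login.sh <<'EOF'
#!/bin/sh
PROJ="$HOME/Webseiten/Tableratings/NL100"
mkdir -p "$PROJ"
# Fresh form login on every run; field names are hypothetical.
curl -c "$PROJ/cookies.txt" \
     -d "username=USER" -d "password=PASS" \
     "https://example.com/login"
# HTTrack reads the cookies.txt it finds in the mirror directory.
httrack "https://example.com/search" -O "$PROJ" -r1 -R3 -c4 -s0 -Q
EOF
chmod +x fetch_with_login.sh
```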

- Some time ago this worked without the search limitation on that site. At that
time I used this command:

-O "c:\Webseiten\Tableratings\Loser_Daily_%cdate%_%ctime%" -r1 -R3 -%%P -c4
-X0 -s0 -Q -N Daily%%p/NL%%[^stakes]^00.%%t --index

Please help me! I need some kind of macro which logs me in to the site and then
fetches the URLs, or something which records the whole login procedure (which is
form-based, if you look at the site) so that HTTrack can "type" in the
username/password before it begins to fetch the URL, and not just record a
single click like in this tutorial:

