Hello,
First, let me compliment everyone who has made this
amazingly powerful software possible. Kudos!!
I have a few queries:
1) I read on the forums that we can raise the limit of
1000000 links by entering an option in the "Web Addresses"
field. I understand that we have to enter the URL followed
by the option -#L1000000, e.g.

   http://www.domain.com/ -#L1000000

Am I correct? (I have also sketched what I believe the
equivalent command line would be at the end of this
question.)
Also, looking at Options > Limits, I can see that the
maximum number in the drop-down list is 50000000. Does that
mean 50000000 is the absolute maximum, or would the
-#L1000000 option ensure there is no limit at all? That is
what I am aiming for, because the site I am downloading has
far more URLs than 50000000. :)
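For reference, here is my guess at the equivalent
command-line invocation (the URL and output path are just
placeholders, and I am not sure I have the syntax exactly
right):

   # mirror the site, raising the maximum number of links tested
   httrack http://www.domain.com/ -O /path/to/mirror "-#L1000000"

I put quotes around -#L1000000 because I read that the
shell would otherwise treat # as the start of a comment.
Is this the right way to pass the option?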
2) What exactly is the "cache"? I stopped the download
yesterday and started it again. It began reading from the
cache and "parsing HTML"... Then it suddenly gave me a
message saying that the 1000000 limit had been exceeded and
that it had to stop. And then, after I changed the options
to increase the limit, it started downloading the whole
site from the beginning. Why is this happening? I thought
it was supposed to parse the cached files again and then
continue the interrupted download. Please help me.
I even checked the old.dat file, but it is only a few
kilobytes. Have I made a mistake somewhere?
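In case it helps, my understanding from the forums is that
the cache lives in the project's hts-cache folder, roughly
laid out like this (names taken from what I have read; my
version may differ):

   hts-cache/
      new.dat + new.ndx    current cache (newer releases use new.zip instead)
      old.dat + old.ndx    the previous run's cache, kept as a backup
      doit.log             the options used on the last run

Does the tiny old.dat suggest that my cache was overwritten
before the resume could use it?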
Looking forward to your earliest reply.
Cheers,
Lin