HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Too many URLs, giving up..(>100000)
Author: Chris K.
Date: 08/25/2019 21:37
 
I find such restrictions IDIOTIC, to say the least. This idiotic piece of code
interrupted my download in the middle of mirroring some 50,000+ papers,
including their .bib files. Who are you to tell me when to stop?
I have now restarted it with -#L9999999 added - how idiotic is that? Does that mean it
has to stop at 10 million files, but 9,999,999 are OK?
I find such artificial restrictions absolutely stupid. PLEASE remove them
ASAP! I know what I am doing; I don't need you to protect me!
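For readers hitting the same wall: the limit being complained about is HTTrack's internal cap on the number of links it will test (the default triggers the "Too many URLs, giving up..(>100000)" message). It can be raised with the -#L option mentioned above. A minimal command-line sketch, assuming a hypothetical target URL and output directory:

```shell
# Raise HTTrack's maximum-links cap before starting a large mirror.
# "-#L10000000" sets the limit to 10 million links; the quotes stop the
# shell from treating '#' as the start of a comment.
# https://example.com/papers/ and ./mirror are placeholders.
httrack "https://example.com/papers/" -O ./mirror "-#L10000000"
```

Note that raising the cap does not remove it; the mirror will still stop once the new limit is reached.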
 


All articles

Subject                                    Date
Too many URLs, giving up..(>100000)        05/22/2001 10:33
Re: Too many URLs, giving up..(>100000)    05/22/2001 11:01
Re: Too many URLs, giving up..(>100000)    01/29/2010 11:05
Re: Too many URLs, giving up..(>100000)    04/18/2015 09:44
Re: Too many URLs, giving up..(>100000)    03/08/2019 13:09
Re: Too many URLs, giving up..(>100000)    06/01/2019 22:21
Re: Too many URLs, giving up..(>100000)    07/14/2019 02:26
Re: Too many URLs, giving up..(>100000)    08/25/2019 21:37
Re: Too many URLs, giving up..(>100000)    02/07/2020 18:41
Re: Too many URLs, giving up..(>100000)    02/28/2020 18:21
Re: Too many URLs, giving up..(>100000)    05/19/2020 03:49
Re: Too many URLs, giving up..(>100000)    11/06/2020 21:09
Re: Too many URLs, giving up..(>100000)    11/29/2020 21:37

Created with FORUM 2.0.11