HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Too many URLs, giving up..(>100000)
Author: Christian
Date: 04/18/2015 09:44
 
Soimosan and Russ:

I have the same questions you two had. The devs here are doing this AMAZING
project as a "charity" for average people like us, and have limited time to
answer questions. So I am going to tell you how to fix this easily and finish
your project:

I know this is kind of an old thread, but despite having used HTTrack for
almost 7 years, I hit the frustrating "Panic: too many links" error, which
stops pages from being mirrored all the way, just this week (April 2015)!

But I KNOW there are average users like me who still have this problem, and I
am here to help. :)

***I am going to post the fix for this problem here, easy and step-by-step,
for n00bs and experts alike. :) ***

I confess, I almost went crazy with frustration. HTTrack works so dang well
that I NEVER look at the error log! So this was the first time I ever hit the
"Too many URLs, giving up" error.

I tried everything: adding that option (example: -#L1000000) to "Rules", to no
avail. Tried it in the "Add URL" and "Web Addresses" fields, to no avail. I
tried deleting the dang .tmp files; that didn't work either. I spent hours
creating +* rules for each major link artery... nothing.

Well... after banging my head against the wall for 3 days, I finally read the
error log carefully. Ta-da!

So here is the easy way to fix this. The usual advice, "use the #L option for
more links (example: -#L1000000)", was not making sense on its own.

So here is what I did:

1) Do NOT delete hts-cache or the .tmp files like I did! Otherwise the
download will start again from scratch and take a long time. Go back and open
the existing project, then select the action "Continue interrupted download";

2) Click on "Set Options" --> "Limits" tab --> "Maximum number of links" (the
bottom field) --> pick the highest value available there, or just type
999999999. (Command-line folks: see the sketch right after these steps.)
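
For anyone driving HTTrack from the command line instead of WinHTTrack, here
is a rough equivalent of those two steps. This is only a sketch: the project
path is a placeholder you must change, and you can adjust the link limit to
taste.

  # Continue the interrupted mirror from its existing hts-cache,
  # with the "maximum number of links" limit raised:
  httrack --continue -O "/path/to/my-project" "-#L999999999"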

But be warned! You need to keep checking the "In progress" download screen to
be sure you are not copying the entire Internet! lol  One way to avoid that is
to put "0" in the "Maximum external depth" field (Limits tab) and check "No
external pages" under the "Build" tab. I know this can cause you to miss some
files, but those can be added back manually under "Rules". (Again, a
command-line sketch follows below.)
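
If you are starting a fresh mirror from the command line, the same "stay on
this site" setup would look roughly like the sketch below. The URL and the
filter pattern are only examples; -%e0 sets the external links depth to 0, and
the +/- patterns are the same filters you would put under "Rules".

  # Hypothetical fresh mirror that stays on one site:
  httrack "https://www.example.com/" -O "/path/to/my-project" \
          "-#L999999999" -%e0 "+*.example.com/*"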

Hope this helps! :) Happy mirroring, and be blessed!

THANK YOU to the crew at HTTrack for this awesome project! :) I cannot write
code, but I am happy to donate yearly to keep the site/host going. :) Thanks
all. :)

Christian
 