HTTrack Website Copier
Free software offline browser - FORUM
Subject: Panic: too many URL (> 100000)
Author: Fernando
Date: 11/23/2019 12:51
 
We are trying to archive a huge vBulletin site (more than 100 GB of database
files, maybe over 500 GB of HTML) as static HTML files, as we are dropping
vBulletin.

I have tried wget, but I could not get the links in the saved pages to work correctly.
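
For reference, the invocation was roughly along these lines (the URL is a
placeholder here, and the exact flags varied between attempts):

    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent "https://forum.example.com/"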

Somebody pointed me to WinHTTrack, and it works much better: I was able to
download a small part of the site and have it working offline.

The problem comes when I try to download the full site.

After running all night, the program stopped after downloading about 7 GB of
files, with a panic message: "too many URLs, more than 100000 URL files to
download".

It seems that HTTrack imposes an artificial limit of 100,000 files to be
processed.

Is there any way to circumvent this?
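The documentation seems to mention a "-#L" option for raising this limit;
would passing something like the following work? (The 1000000 value is just a
guess on my part, and the URL and output path are placeholders.)

    httrack "https://forum.example.com/" -O /archive/forum -#L1000000

I assume there is an equivalent setting somewhere in WinHTTrack's options,
but I have not found it yet.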
It is also too slow, but I will ask about that in a separate thread.

Thank you.
 