We are trying to archive a huge VBulletin site (more than 100 GB of database
files, maybe over 500 GB of HTML) as static HTML files, because we are
dropping VBulletin.
I have tried wget, but could not get the links to come out correctly.
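For reference, the kind of wget invocation I was using looked roughly like
this (an illustrative reconstruction, not the exact command; the domain is a
placeholder for our real forum):

    # forum.example.com is a placeholder for our real domain
    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://forum.example.com/

Even with these options the mirrored pages did not link together correctly
for us.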
Somebody pointed me to WinHTTrack, and it works much better: I was able to
download a small part of the site and get it working offline.
The problem comes when I try to download the full site.
After working all night, the program stopped after downloading about 7 GB of
files, with a panic message: "too many urls, more than 100000 url files to
download".
It seems the program imposes an artificial limit of 100,000 files to be
processed.
Is there any way to circumvent this?
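For example, if I understand the documentation correctly, the command-line
httrack has a -#L option for the maximum number of links; would something
along these lines raise the limit, or is there an equivalent setting in
WinHTTrack? This is an untested guess on my part, again with a placeholder
domain:

    # placeholder domain; -#L value is only a guess at a higher link limit
    httrack https://forum.example.com/ -O /archive/forum -#L10000000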
It is also too slow, but I will ask about that in a separate thread.
Thank you.