HTTrack Website Copier
Free software offline browser - FORUM
Subject: large websites
Author: ed carter
Date: 05/08/2004 19:02
 
I have found the reason why I have not been able to 
download the website I was having problems with. I thought 
it had to do with the transfer rate, because when the rate 
was low the files appeared to download and when the rate 
was high the download failed. I was not able to control 
the transfer rate using the transfer rate limit option; 
that particular issue is still unresolved, unless it is 
connected to the following. The site I was trying to 
download is fairly large, about 2.7 GB, and consists mostly 
of .jpg files, over 25,000 of them. What HTTrack 
was doing was attempting to save all the .jpg files into one 
folder. Windows cannot handle directories that large, and 
once the threshold was reached the remaining files failed 
to download. The folder that contained the .jpg files had 
22,744 files in it and a size of 1.57 GB; when that limit 
was reached, the remaining files, several thousand of them, 
failed to download. Perhaps HTTrack could be modified, 
through a configuration option, to place large numbers of 
files in subfolders, say with a limit of 5,000 files per 
folder. It would then have to keep track of everything so 
that the links work properly when browsing offline. I am sure 
this would call for extensive modification, but I think it 
could be done.
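
For illustration, here is a minimal sketch in C of the kind 
of bucketing I mean. This is not HTTrack's actual code; the 
function name, the "partNNNN" folder naming, and the 5,000 
file limit are all hypothetical. It just shows how a running 
file counter could be turned into a save path that never puts 
more than 5,000 files in one directory:

    /* Hypothetical sketch, not HTTrack code: bucket saved files
       into numbered subfolders of at most FILES_PER_DIR each. */
    #include <stdio.h>

    #define FILES_PER_DIR 5000UL

    /* Build a path like "images/part0004/pic22744.jpg" from a
       running file counter. */
    static void bucketed_path(char *out, size_t outlen,
                              const char *base, const char *name,
                              unsigned long file_index)
    {
        snprintf(out, outlen, "%s/part%04lu/%s",
                 base, file_index / FILES_PER_DIR, name);
    }

    int main(void)
    {
        char path[512];
        bucketed_path(path, sizeof path, "images",
                      "pic22744.jpg", 22744UL);
        printf("%s\n", path);  /* prints images/part0004/pic22744.jpg */
        return 0;
    }

The hard part, as I said, would not be the path itself but the 
bookkeeping: every link in the saved HTML would have to be 
rewritten to point at the right subfolder.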
 