HTTrack Website Copier
Free software offline browser - FORUM
Subject: Too many URLs, giving up..(>100000)
Author: mikeeosa
Date: 12/03/2013 06:26
WOW. Three days to download 5 GB of a website, only for it to stop with
"06:48:16 Panic: Too many URLs, giving up..(>100000)". The folder is now only
60 MB, where it was more than 5 GB last night with more than 80,200 files in
it. Yes, you're reading that correctly: in total there should be more than
100,000 files, so 90% of what was downloaded is gone. Do you guys think
bandwidth is free all over the world? Think again. The files are gone, and so
are the time and the bandwidth. I watched it create these STUPID .tmp files.
Why can't it save and keep the HTML files the first time around, instead of
going through the whole process of saving .tmp files, then, once it hits
100,000 links, stopping with a panic, closing, and, to top it all off,
REMOVING THE DOWNLOADED .tmp FILES? Everything is now gone, not on my PC
anymore. Are you for real, people? If I had known this, I would not have
wasted my time with this program. What is -#L200000? Where do I add it to get
this thing to copy a practically unlimited number of files directly to HTML,
with no .tmp files, without it stopping or needing anything from me?
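[Editor's note, not part of the original post: `-#L` is HTTrack's expert option for the maximum number of links the engine will test before aborting with the "Panic: Too many URLs" message (default 100000); in WinHTTrack the same limit lives under Options > Limits. A sketch of a rerun with a raised limit, where the URL and output path are placeholders:]

```shell
# Sketch, assuming the httrack CLI is installed.
# http://example.com/ and /path/to/mirror are placeholders.
# -#L<N> raises the link-count limit behind the
# "Panic: Too many URLs, giving up..(>100000)" abort.
httrack "http://example.com/" -O "/path/to/mirror" -#L2000000
```

[The limit exists as a runaway-crawl safeguard, so raising it should go together with scope filters on the mirror rather than an effectively infinite value.]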

Created with FORUM 2.0.11