HTTrack Website Copier
Free software offline browser - FORUM
Subject: 10.000 domains in list of URL and full STOP
Author: Yegg
Date: 11/24/2003 03:02
 
Well, I have a file with 10,000 domains, and I run HTTrack to download all of
these sites with depth=0. The speed is around 20-30 KB/s for 10-15 seconds,
and then everything comes to a full stop: all threads hang. I tried reducing
the list to 200 or 300 domains, and the result is the same. I tried it on
Linux 7.3 with a 2.6 kernel and a 10 MB/s line, and also on Windows 2000 and
XP with a 1 MB/s line, and the problem is the same. How can I fix this? For
comparison, when I run wget with just one thread it downloads file after file
at good speed, but HTTrack simply refuses to work. If I mirror a single site,
e.g. www.yandex.ru, the speed is fine (around 930 KB/s on the Linux server),
but as soon as there are a lot of domains it stops working completely.
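For reference, a sketch of the kind of invocation the post describes, assuming the domain list sits in a file (the names `domains.txt` and `./mirror` are placeholders, not the poster's actual paths):

```shell
# HTTrack: read one URL per line from a list file (--list), crawl at the
# depth the poster mentions, and write the mirror into ./mirror.
httrack --list domains.txt --depth=0 -O ./mirror

# The wget comparison from the post: a single sequential download per URL,
# taken one after another from the same list file.
wget --input-file=domains.txt
```

This is only a usage sketch of the setup being reported, not a fix for the stall itself.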
 



