HTTrack Website Copier
Free software offline browser - FORUM
Subject: Performance issue
Author: Martin Jericho
Date: 09/14/2007 04:34
 
I am using the HTTrack 3.41-3 command-line utility.

I am archiving a website from a web server running on my local PC, and I am
trying to figure out why HTTrack takes so long to download large pages.

I am using the following command line (from a batch file, hence the doubled
%% escapes):
httrack %ARCHIVER_SOURCE_URL% --path "%ARCHIVER_DESTINATION%" --quiet
--extra-log --sockets=1 --connection-per-second=0 --max-rate=0
--disable-security-limits --priority=7 -I0 -N100 -%%P0 --footer=""
--user-agent="Internal Archiver" -%%v1
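
(For reference, the throttling-related options as I understand them:
--sockets=1 forces a single connection, --connection-per-second=0 and
--max-rate=0 should both mean "no limit", and --disable-security-limits
lifts HTTrack's built-in bandwidth caps.)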

HTTrack downloads files at a rate of only 164 KB/s, even though I have set
every option needed to disable throttling, and I am using only a single
connection, so there shouldn't be any concurrency issues.

A 3.3 MB file takes 21 seconds to download with HTTrack (3.3 MB over 21 s
works out to roughly 160 KB/s, consistent with the rate above), whereas it
loads almost instantaneously in a browser or any other program. During the
download, the CPU is 99% idle.
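
To put a number on "almost instantaneous", here is a rough sketch of the
comparison I have in mind (Python; the URL is just a placeholder for the
test file on my local server). It times a plain single-connection download
of the same file:

import time
import urllib.request

# Placeholder URL for the 3.3 MB test file on the local web server.
url = "http://localhost:8080/test/bigfile.bin"

start = time.time()
with urllib.request.urlopen(url) as response:
    data = response.read()  # one blocking read of the whole body
elapsed = time.time() - start

kb = len(data) / 1024.0
print("%.0f KB in %.2f s = %.0f KB/s" % (kb, elapsed, kb / elapsed))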

Why is HTTrack so slow?

Another odd detail: the rate also depends on the web server. Tomcat or a
very simple web server gives the fastest result (21 seconds), but with
standalone OC4J the same download blows out to 120 seconds (only 28 KB/s,
which again matches 3.3 MB over 120 s). Yet every one of these web servers
delivers the file almost instantaneously to any client other than HTTrack.
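
Since the rate differs per server but is constant for each one, I wonder
whether the time goes into each individual read rather than into the
transfer itself. This sketch (same placeholder URL idea, and the 8 KB read
size is just an assumption) reports the average cost per read, which should
expose a fixed per-read delay if there is one:

import time
import urllib.request

url = "http://localhost:8888/test/bigfile.bin"  # placeholder: the file on OC4J
chunk_size = 8192                               # assumed read size

start = time.time()
total = 0
reads = 0
with urllib.request.urlopen(url) as response:
    while True:
        chunk = response.read(chunk_size)
        if not chunk:
            break
        total += len(chunk)
        reads += 1
elapsed = time.time() - start

print("%d bytes in %d reads, %.1f ms/read, %.0f KB/s"
      % (total, reads, 1000.0 * elapsed / max(reads, 1),
         total / 1024.0 / max(elapsed, 1e-9)))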

I would really appreciate it if someone could explain this.

Thanks
Martin
 