Hi:
> I'm duplicating a very very large website and every so often it freezes.
First problem: a big website means a long download. The bigger the site, the
longer it takes.
> All of the downloads stop although it still has a transfer rate stated...
Speed isn't everything. This is a bit like driving a Formula 1 car on a narrow
mountain road. 50,000 files put a huge amount of stress on the network: how
much time would it take you to actually read all those files? And how much time
are you willing to spend downloading the site? Those two numbers are nowhere
near each other, I think.
It's difficult to know how to manage this kind of download, but first of all
you have to figure out what kind of hardware the server has, and therefore how
many connections per second (-%cN), how much bandwidth (-AN) and how many
simultaneous connections (-cN) it can handle. Then think about what kind of
link the server sits on: fast, expensive, slow, limited, ...?
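For reference, those three limits all go straight on the httrack command line.
A minimal sketch (the URL, the -O output directory and the exact numbers here
are just placeholders, not a recommendation for your particular server):

    httrack "http://www.example.com/" -O "./example-mirror" -c2 -%c1 -A25000

That would mean at most 2 simultaneous connections, at most 1 new connection
per second, and roughly 25 KB/s of total transfer.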
Anyway, it would be good for you to read this page first:
<http://www.httrack.com/html/abuse.html>
and then act accordingly, treating the server with extreme care. Maybe try
-%c0.5 -A5000 -c1. It will be sloooow for sure (days or even weeks to complete)
but much kinder to the server.
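Just to give an idea of the scale, assuming (pure guess) an average file size
of around 20 KB:

    50,000 files x 20 KB  ~ 1,000,000 KB ~ 1 GB
    1,000,000 KB / 5 KB/s ~ 200,000 s    ~ 2.3 days

and -%c0.5 alone (one new connection every 2 seconds) already means at least
100,000 seconds of pacing for 50,000 files. So yes: days, easily weeks if the
files are bigger.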