So, firstly, I just want to say that I get that this program is old and appears
not to be updated anymore. That being said, it is a robust program with lots of
features/settings, and because of that I have to conclude that it is possible
to achieve faster download speeds when mirroring sites. There has to be a way;
I have already achieved slightly faster speeds using some commands. To preface,
I am running Windows 10 x64, and my internet connection is 1 Gbit download /
45 Mbit upload.
I am trying to fully mirror a manga site that has at least 100,000 individual
manga on it. What I have done so far to increase speeds marginally over the
defaults was to add the following lines in the Scan Rules section under Options:
--disable-security-limits
--max-rate 900000
I also added: --keep-alive
because I read that it was supposed to improve speeds and cut down on the
number of connections that have to be opened to the server.
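For reference, if I were running the same thing from the httrack command line
instead of the GUI, I believe those options would look roughly like the line
below (the URL and output folder here are just placeholders, not the real site):

httrack "https://example-site.example/" -O "C:\mirror" --disable-security-limits --max-rate 900000 --keep-alive

One thing I am not sure about is the units: as far as I can tell from the
documentation, --max-rate (-A) is measured in bytes per second, not bits, so
900000 would cap things at roughly 0.9 MB/s if I am reading that right.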
Under the Limits tab, I have tried the following for the maximum transfer rate:
leaving the box empty, setting it to 0, and typing in 900000.
I also have Maximum Number of Links set to 9999999999, because in an 11-hour
test run WinHTTrack parsed over 1.6 million links before I paused it, and if I
don't put a number in that box it throws errors and cancels the mirror
automatically after some time.
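In case it matters, I think the Maximum Number of Links box maps to the -#L
option (long form --advanced-maxlinks) on the command line, so what I have set
would be roughly:

-#L9999999999

(that exact number is just what I typed into the GUI box; I have no idea what
the real internal ceiling is).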
Under the Flow Control tab, I have played around with the Number of
Connections. I have tried 3, 5, 8, 20, and 99, with mixed results; the higher
numbers usually produce errors, and 5 or 6 seems to be the sweet spot that the
server allows. Persistent Connections is checked.
I have nothing in the timeout or retries boxes.
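For completeness, my understanding is that the Flow Control settings correspond
to command-line options roughly like this; the timeout and retry values below
are only examples, since mine are currently blank:

-c6 --keep-alive -T60 -R2

where -c (--sockets) is the number of connections, -T (--timeout) is the
timeout in seconds, and -R (--retries) is the number of retries.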
I am storing all files in the cache, and Make an Index is checked. I have it
set to Get HTML Files First, and Get Non-HTML Files Related to a Link is
checked. I also have the spider set to No robots.txt Rules.
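Putting it all together, I believe the command-line equivalent of my current
setup would be something like the line below. The URL and output path are
placeholders and I am going from memory on some of the option letters, so treat
it as a rough sketch rather than exactly what WinHTTrack generates:

httrack "https://example-manga-site.example/" -O "C:\mirrors\manga" -c6 --max-rate 900000 --keep-alive --disable-security-limits -#L9999999999 -k -p7 -n -s0

(-k should be store all files in cache, -p7 get HTML files first, -n get
non-HTML files related to a link, and -s0 no robots.txt rules, if I have those
right).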
If anyone can expand on this, give me insight on whether I am doing something
wrong, or tell me whether it is even possible to achieve megabit speeds while
downloading, I would appreciate it. I feel like this should be possible
somehow. Thanks.