I'm trying to download a website with a forum and a wiki that is about to close in the near future.
The forum has around 15,000 topics, each with around 2 pages of posts on average.
The wiki has around 1,000 pages in total.
The main website has around 1,000 articles, plus all the other pages, so I'd say around 1,100 pages in total.
I started a download and HTTrack fetches about 1 file per second at around 20 KiB/s.
That's too slow: at 1 page per second, the roughly 32,000 HTML pages alone (30,000 forum pages plus the wiki and the main site) would take close to 9 hours, before counting any images. I googled it and found
(https://forum.httrack.com/readmsg/25863/25858/index.html) that I have to
add --disable-security-limits to the Scan Rules options. I did that:
<https://i.imgur.com/6rJ36Wy.png>
I also set the maximum transfer rate to 1 MiB/s: <https://i.imgur.com/nUrTDFM.png>
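If it helps, I believe the command-line equivalent of what I've configured in WinHTTrack would be roughly the following (the output folder is just an example, and I'm guessing at how the GUI maps to the long options, so treat this as a sketch of my settings rather than exactly what the GUI runs):

```
# 1048576 bytes/s = 1 MiB/s; --disable-security-limits is supposed to lift
# HTTrack's built-in bandwidth cap (~25 KiB/s by default, as far as I understand)
httrack "http://www.tekkenzaibatsu.com/" -O ./tekkenzaibatsu \
    --disable-security-limits \
    --max-rate=1048576
```

I haven't touched anything related to the number of parallel connections in the GUI, only the two settings above.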
None of this helped: it still downloads at only around 20 KiB/s on average.
The website in question is <http://www.tekkenzaibatsu.com/>. The site itself is
fine: I can jump from page to page in less than a second in my browser, and
those pages have dozens of images on them, so the server can definitely
transfer data faster than 20 KiB/s.
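For what it's worth, in case someone asks whether the server itself is throttling me, this is how I'd check the raw download speed outside of HTTrack (assuming curl is available; the URL is just the site's front page):

```
# prints the average download speed for a single page, in bytes per second
curl -s -o /dev/null -w '%{speed_download}\n' "http://www.tekkenzaibatsu.com/"
```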
So how do I make it download faster? Please help, and thank you.