HTTrack Website Copier
Free software offline browser - FORUM
Subject: Parsing HTML & ETA Estimation.
Author: Marcus W.
Date: 03/14/2011 16:31

I have a few more questions, if you don't mind. I've searched this forum a
bit, but haven't found any specific answers yet.

From what I've read, it's clear that copying bigger forums can take a while
and that the actual operation - the parsing itself, I mean - can take hours, BUT...

I've been running it for 1 day and 10 hours; the current results are 749/8310
(+7558) and 3506 files. Even before I started it, I figured it might run for a
couple of hours, but this seems way too long.
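For the record, here is the back-of-the-envelope arithmetic on those numbers. It's a rough sketch only: it assumes the parse rate stays constant, which it probably won't, since the link count (+7558) keeps growing as new pages are discovered.

```python
# Rough ETA from the figures above: 749 of 8310 known links parsed
# after 1 day and 10 hours. Assumes a constant parse rate, which is
# optimistic -- the total link count grows during the crawl.
done, total = 749, 8310
elapsed_hours = 24 + 10                 # 1 day 10 hours so far
rate = done / elapsed_hours             # links parsed per hour
remaining_hours = (total - done) / rate
print(f"{rate:.1f} links/hour, roughly {remaining_hours / 24:.1f} more days")
```

At that pace it works out to about two more weeks, which is why the ETA feels hopeless.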

It's a phpBB forum with around 20 registered members and not even 3500 posts
in it. After all, it's not that enormous, and 200 MB
isn't that bad either.

My questions:

1) Is it possible to make the parsing / downloading any faster? At 300 B/s it
looks like I'm going to have to wait a few months...

2) Parsing doesn't mean the actual downloading, right? It's only some sort of
scanning process?
3) Is it possible that the forum site was somehow overloaded by the scanning
and downloading, and that's why it looks stuck?
4) How come going from 0 to 180 MB was a matter of 30 - 45 minutes, and then it
looks stuck? It's still doing something, but it looks terribly slow.

5) Is there any way to determine how big that mirror is going to be? I used
the default settings and wasn't messing with depth or anything else. It will
download just this site, and won't go off the forum and start downloading other
sites massively, right?
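In case it helps answer questions 1 and 5, these are the options I understand (from the httrack documentation) to control bandwidth, connections, depth, and scope. Treat this as a sketch, not something I've verified on this forum; the domain is a placeholder, not the real site:

```shell
# Sketch of an httrack invocation (options as I understand them):
#   -A25000  caps the transfer rate at ~25 KB/s, to avoid hammering the server
#   -c4      allows at most 4 simultaneous connections
#   -r6      limits the mirror depth to 6 links from the start page
# The trailing pattern is a scan rule keeping the mirror on the forum itself;
# forum.example.com stands in for the actual forum address.
httrack "http://forum.example.com/" -O ./mirror \
    -A25000 -c4 -r6 \
    "+*.forum.example.com/*"
```

If the defaults behave the same way, the mirror should already stay on the starting site, but I'd like confirmation.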
I hope that's all.

Thank you a lot for answering my questions.

Best Regards,
Marcus W.
