I'm trying to copy the files of the following site:
<http://www.lexsoft.de/lexisnexis/justizportal_nrw.cgi?chosenIndex=Dummy_nv_68&chosenIndex=Dummy_nv_68&templateID=gliederung&tree_ordner_id=0000024:RootID&>
(It's a German site that legally allows you to look up some important laws
online; all I want is to be able to use this service offline.)
My problem is that HTTrack has now been running for almost 3 days, 24 hours a
day, and has already copied about 13 GB - but if I open the mirrored website
in my browser, many pages are still not available (though some work).
HTTrack says it has written 45000 files and worked 4000/50000 (+46000) - what
do these numbers mean?
I think the problem is the site's navigation with all its dummy parameters:
the hyperlinks are all very long and complicated, and I can't figure out how
I would have to change my filters to make HTTrack copy only the needed files.
I'm afraid it copies the same page multiple times because its hyperlink is
different each time.
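What I have in mind is something like the scan rules below (just a sketch,
not a working setup - the exclude pattern is a guess based on the repeated
chosenIndex=Dummy_... parameters in the links, and the depth limit of 8 is an
arbitrary value I picked; I don't know whether either would break the
navigation):

    # hypothetical attempt: mirror only the portal script, skip the
    # duplicated Dummy navigation links, and limit the mirror depth
    httrack "http://www.lexsoft.de/lexisnexis/justizportal_nrw.cgi" \
        -O ./justizportal \
        "+www.lexsoft.de/lexisnexis/justizportal_nrw.cgi*" \
        "-*chosenIndex=Dummy*" \
        -r8

As far as I understand, HTTrack treats every distinct query string as a
separate URL, so without an exclude rule like this the same page would be
fetched once per link variant - is that right?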
No errors occurred during runtime.
Is there any way to change my options/filters to decrease the estimated
download time and file size? Or is simply waiting a realistic option, so that
it's just a matter of time and disk space (which wouldn't be a problem)?
Thank you very much, and sorry for my lousy English - I'm from Germany, as
you might have found out, and did my best.