I have been trying to download a 'wiki' as well as several forum websites. In
all cases the download seems endless, with multiple copies of the same file(s)
being created. Here is an example from new.txt:
18:23:21 211550/211550 ---M-- 200 added ('OK') text/html date:Fri,%2029%20Nov%202013%2023:22:27%20GMT
http://themurderofmeredithkercher.com/index.php?title=Primary_Sources&diff=6582&oldid=3795
N:/My%20Web%20Sites/themurderofmk/themurderofmeredithkercher.com/index8527.html
(from http://themurderofmeredithkercher.com/index.php?title=Primary_Sources&offset=&limit=250&action=history)

18:23:21 40390/40390 ---M-- 200 added ('OK') text/html date:Fri,%2029%20Nov%202013%2023:22:28%20GMT
http://themurderofmeredithkercher.com/index.php?title=Primary_Sources&diff=3795&oldid=3730
N:/My%20Web%20Sites/themurderofmk/themurderofmeredithkercher.com/indexeab8.html
(from http://themurderofmeredithkercher.com/index.php?title=Primary_Sources&offset=&limit=250&action=history)

18:23:22 37215/37215 ---M-- 200 added ('OK') text/html date:Fri,%2029%20Nov%202013%2023:22:29%20GMT
http://themurderofmeredithkercher.com/index.php?title=Primary_Sources&oldid=3795
N:/My%20Web%20Sites/themurderofmk/themurderofmeredithkercher.com/index02ce-2.html
(from http://themurderofmeredithkercher.com/index.php?title=Primary_Sources&offset=&limit=250&action=history)
It seems that the "index.php?title=X" query strings lead HTTrack to create a
separate HTML file for every URL variant. Is there any way, using filters,
options, or both, to force HTTrack to save only one copy of each page it
finds, rather than multiple copies? Thanks in advance.