Sorry, I am new to this, so I am not sure what information is needed. I am using
HTTrack version 3.42-2. Here are the settings I used:
1. Scan rules: selected all the file types, +*.css through +*.wmv, and added
+*.swf and +*.pdf
2. Limits: mirror depth 20, maximum size limits
3. Build: site structure
4. Spider: accept cookies; check document type if unknown (except /); parse
Java files; Spider: no robots.txt rules; update hacks; URL hacks
5. Browser ID: Mozilla/4.5; HTML footer: <!-- Mirrored from %s%s by HTTrack
Website Copier/3.x [XR&CO'2007], %s -->
6. Log, index, cache: create log files, make footer
7. Experts only: use cache for updates, store all files, can go down, stay on
the same domain, relative URL (a rough command-line equivalent of these
settings is sketched just after this list)
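
If it helps, this is roughly how I understand those settings would look on the
command line. The URL, output folder, and filter list here are only placeholders
and an abbreviated example, and I am not certain every GUI panel maps exactly to
a switch:

   # -r20  mirror depth 20
   # -s0   do not follow robots.txt rules
   # -F    browser identity string
   # the +*.ext patterns are the scan-rule filters
   httrack "http://www.example.com/" -O "C:\My Web Sites\example" -r20 -s0 -F "Mozilla/4.5" "+*.css" "+*.js" "+*.gif" "+*.jpg" "+*.png" "+*.swf" "+*.pdf"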
I get about 140 files, when about 450 are needed for the whole site. Also, I can
only view the first page offline; all the other pages try to go to the internet
for their content.
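
If it would help to narrow this down, I assume I could list which saved pages
still contain absolute links back to the live site with something like the
following (assuming a Unix-style grep is available; the site URL and mirror
folder are placeholders):

   # list mirrored pages that still point at the live site instead of local files
   grep -rl "http://www.example.com" "C:\My Web Sites\example" --include "*.html"

Would that tell me anything useful about why the links were not converted?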
The log file gives no hint as to what is wrong; it only lists the file types
being downloaded and a few missing .jpg files on the site, which I think are
valid messages. From previous log files I picked up that the robots.txt setting
needed to be "no rules".
Thanks for your help.