I checked the options you mentioned in points 2 and 3, but this time I chose a different action type: "download site" instead of getting separated files. I was kind of surprised how smoothly it went this time; I didn't even need to bother importing page paths.
Next time, though, I need to limit which links get checked; for sites like www.serverbeach.com I had to manually answer "no, don't follow" each time.
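For what it's worth, that prompt can probably be avoided by excluding the domain up front with a negative scan rule, so HTTrack never considers those links at all. A rough command-line sketch (reusing the include rules from my settings below; the exact flags beyond the scan rules are assumptions, not tested):

```
httrack "http://1.1.1.3/bmi/cdn.mangaeden.com/mangasimg/" \
  -O "D:\temp\jpg" \
  "+*/*http://1.1.1.3/bmi/cdn.mangaeden.com/mangasimg/*" "+*.jpg" \
  "-*serverbeach.com/*"
```

The same `-*serverbeach.com/*` rule should also work in WinHTTrack's Scan Rules box.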
I just need to optimize now. Thanks for your help, bye!
My settings were:
HTTrack3.46+htsswf+htsjava launched on Wed, 29 Aug 2012 19:35:34 at
+*/*http://1.1.1.3/bmi/cdn.mangaeden.com/mangasimg/* +*.jpg
(winhttrack -WC2%Pns2u1z%s%uN0%I0p3DaK0H0%kf2A25000#L100000%f0#f -F
"Mozilla/4.5 (compatible; HTTrack 3.0x; Windows 98)" -%F "<!-- Mirrored from
%s%s by HTTrack Website Copier/3.x [XR&CO'2010], %s -->" -%l "de, en, *" -%L
[removed] -O1 D:\temp\jpg +*/*http://1.1.1.3/bmi/cdn.mangaeden.com/mangasimg/*
+*.jpg )