This wget command line grabs the first 5 pages and places them into a folder
EEE-Test in the current directory.
I put a space after the http and // so it doesn't turn into a link.
for /L %a in (1,1,5) do wget -p -E -PEEE-Test http ://forum.eeeuser.com/viewpoll.php?id=1561^&p=%a
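If you want to sanity-check what that cmd.exe loop will actually request before pointing wget at the live forum, a bash equivalent can just print the five URLs (URL taken from the post, with the deliberate space removed; the wget flags in the comment are the same ones used above):

```shell
# Sketch only: print the five page URLs the FOR /L loop above fetches.
# Each line is what you would pass, quoted, to: wget -p -E -PEEE-Test <url>
base="http://forum.eeeuser.com/viewpoll.php?id=1561"
for p in 1 2 3 4 5; do
  echo "${base}&p=${p}"
done
```

Quoting the URL matters in bash for the same reason `^&` is needed in cmd.exe: an unescaped `&` would otherwise end the command.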
I tried more or less the same thing using the httrack command line, but it
wouldn't work.
"C:\Program Files\WinHTTrack\httrack.exe"
-qwr0C2%Pns2u1%s%uN0%I0p3DaK0H0%kf2E180A25000%f#f
<http://forum.eeeuser.com/viewpoll.php?id=1561> -O1 "J:\Temp
Folders\wbtTemp\EEE-Test" +*.css +*.js -ad.doubleclick.net/*
-mime:application/foobar +*.gif +*.jpg +*.png +*.tif +*.bmp
That gets me the first page, but all the other pages come out as .readme files
containing the following:
The file J:/Temp Folders/wbtTemp/EEE-Test/forum.eeeuser.com/viewpoll377d.html
has not been scanned by HTS
Some links contained in it may be unreachable locally.
If you want to get these files, you have to set an upper recurse level, and to
rescan the URL.
So there are some other settings to work with.