HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: copying a punbb forum completely
Author: Gadrin
Date: 03/27/2008 21:34
 
This wget command line grabs the first 5 pages and places them into a folder
named EEE-Test in the current directory.

I put spaces in after the http and // so it won't turn into a link.

for /L %a in (1,1,5) do wget -p -E -PEEE-Test http ://forum.eeeuser.com/viewpoll.php?id=1561^&p=%a
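For anyone on a Unix box, the same loop can be sketched for a POSIX shell (this is my own translation, not a tested recipe; it assumes wget is installed and the URL is restored without the anti-link spaces):

```shell
#!/bin/sh
# Sketch of the same page loop for a POSIX shell.
# The poll id and page range mirror the Windows command above.
base="http://forum.eeeuser.com/viewpoll.php?id=1561"
for p in $(seq 1 5); do
    url="$base&p=$p"
    echo "fetching $url"
    # wget -p -E -P EEE-Test "$url"   # uncomment to actually download
done
```

The quoting matters: the & must be inside quotes (or escaped) or the shell will background the command, which is the same reason the Windows version needs the ^& escape.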

I tried more or less the same thing using the httrack command line, but it
wouldn't work.

"C:\Program Files\WinHTTrack\httrack.exe" -qwr0C2%Pns2u1%s%uN0%I0p3DaK0H0%kf2E180A25000%f#f <http://forum.eeeuser.com/viewpoll.php?id=1561> -O1 "J:\Temp Folders\wbtTemp\EEE-Test" +*.css +*.js -ad.doubleclick.net/* -mime:application/foobar +*.gif +*.jpg +*.png +*.tif +*.bmp

This gets me the first page, but all the other pages come back as .readme files
containing the following:

The file J:/Temp Folders/wbtTemp/EEE-Test/forum.eeeuser.com/viewpoll377d.html
has not been scanned by HTS
Some links contained in it may be unreachable locally.
If you want to get these files, you have to set an upper recurse level, and to
rescan the URL.

So there are some other settings to work with.
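If I'm reading the HTS message right, the setting to work with is probably the recurse level: the r0 buried in the options string above means depth 0, so only the start page is scanned. A sketch of what a depth-based call might look like (an untested guess on my part; -r sets the mirror depth and the +patterns are scan filters, but the exact filters here are assumptions):

```shell
#!/bin/sh
# Hypothetical httrack call raising the recurse level so the linked
# poll pages get scanned instead of being left as .readme stubs.
# -r2 = mirror depth 2; the +filters try to keep the crawl on the poll
# pages plus page assets. (Sketch only -- not a tested recipe.)
cmd='httrack "http://forum.eeeuser.com/viewpoll.php?id=1561" -O "EEE-Test" -r2 "+forum.eeeuser.com/viewpoll.php*" +*.css +*.js +*.gif +*.jpg +*.png'
echo "$cmd"
```

Per the HTS note, after raising the recurse level you would also have to rescan the URL so the already-mirrored page gets re-parsed.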
 