> If I capture a site with the "non-HTML files + pictures" option
> unchecked and then decide to add them later, the formatting of the
> website will not be preserved for offline viewing, but the pictures
> will be added.
Don't capture the CSS files = no formatting.
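If you do want the formatting, scan rules along these lines should pull
in the style sheets (and the images); the exact extensions are just a
guess at what the site serves:

    +*.css +*.js +*.png +*.gif +*.jpg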
> If you just copy-paste
> www.mapleprimes.com/mapleprimesforums/gethelp/howdoimaple?page=2
> www.mapleprimes.com/mapleprimesforums/gethelp/howdoimaple?page=3
> www.mapleprimes.com/mapleprimesforums/gethelp/howdoimaple?page=2
> into the URL box without a carriage return after each line, you will
> get an error when you try to compile the web pages.
A carriage return or a space between URLs will do.
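For example, the URL box should look like this, one URL per line (a
space between them on a single line also works):

    www.mapleprimes.com/mapleprimesforums/gethelp/howdoimaple?page=2
    www.mapleprimes.com/mapleprimesforums/gethelp/howdoimaple?page=3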
> So anyway, I've found out I need to include page=13, page=22, etc.,
> adding nine each time, so each loaded page will have links to pages
> that don't overlap. Then I choose a depth of 3 so the mirror
> continues into each page and I can read the answers.
There is a maximum length for the contents of the URL box before the GUI
crashes (over 32 KB, IIRC). Beyond that, you have to put the URLs in a
file and download all the sites from it (a multiple mirror):
<http://httrack.kauler.com/help/URL_number_sequences>
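If you'd rather generate that list with a script, here is a minimal
sketch in Python. The base URL, the starting page of 13, and the upper
bound of 100 pages are assumptions taken from the post above; adjust
them to the real forum:

    # generate_urls.py -- hypothetical helper script, not part of HTTrack.
    # Writes one URL per line, stepping by nine so the pager links on
    # each captured page don't overlap (13, 22, 31, ..., 94).
    base = "http://www.mapleprimes.com/mapleprimesforums/gethelp/howdoimaple?page={}"
    with open("urls.txt", "w") as f:
        for page in range(13, 100, 9):  # assumed upper bound of 100 pages
            f.write(base.format(page) + "\n")

Then feed urls.txt to HTTrack as its URL list file (the %L option in the
command-line version, IIRC).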
> You can't use a * in the URL; it will not recognize the
> wildcard.
Of course not. What would you expect it to do, ping the server with billions
of combinations?