HTTrack Website Copier
Free software offline browser - FORUM
Subject: jpg from webserver temp directory
Author: matis
Date: 06/02/2005 09:42
I'm trying to download a book from a website (using the URL list from a .txt file -
great idea!). The problem is that the scanned pages of the book are served
from a temp directory on the server: when my browser requests scanned page x, it is
copied from a base directory into the server's temp directory, and when I request
the next page, the old scanned page is removed from temp and the new one is copied
in. The image addresses in the HTML pages look like temp/any_number.jpg (for
example 981, 982, 983, 984, 985, with new numbers for every session). So by the
time HTTrack has finished scanning all my URLs from the .txt file for links and
tries to grab the images from the site, there is nothing left in the temp
directory - nothing but the last image "seen" by the web copier; the rest were
removed as the following pages were requested.
Is it possible to make HTTrack download all of a page's files first, and only
then go on to scan the next URL in the list?
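In the meantime, the behaviour described above can be worked around outside HTTrack with a small script: fetch each page from the URL list and download its temp/ images immediately, before the server purges them for the next page. This is only a sketch, not an HTTrack feature; the file name urls.txt, the helper names, and the temp/number.jpg link pattern are assumptions based on the description above.

```python
# Workaround sketch: grab each page's temp/ images right away,
# before requesting the next page causes the server to purge them.
import re
import urllib.parse
import urllib.request

def extract_temp_images(html, base_url):
    """Return absolute URLs for temp/<number>.jpg links found in the page."""
    links = re.findall(r'temp/\d+\.jpg', html)
    return [urllib.parse.urljoin(base_url, link) for link in links]

def mirror(url_list_file="urls.txt"):
    """For each URL in the list, fetch the page and save its images at once."""
    with open(url_list_file) as f:
        for page_url in (line.strip() for line in f if line.strip()):
            html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")
            for img_url in extract_temp_images(html, page_url):
                name = img_url.rsplit("/", 1)[-1]
                # Download immediately, while the file still exists on the server.
                urllib.request.urlretrieve(img_url, name)
```

The key design point is the ordering: each page's images are fetched inside the same loop iteration as the page itself, instead of collecting all links first and downloading afterwards (which is where the temp files disappear).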

PS Sorry for my English.
If what I've written is completely unclear, I'll paste the address of the book; I
hope it explains everything I couldn't.
