HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Grabbing pictures only
Author: Leto
Date: 06/24/2002 01:35
> Question 1: Is there any way to estimate the size of a site before starting?
No. HTTrack will have no idea how big a site is until it has visited every
single file.

> a. What options do I need to set to capture only .jpg, .jpeg, .jpe files?

Well, this is not really possible: HTTrack needs to visit all the pages
that contain links to the images you want, and when it visits those pages
it will save them.

You can exclude HTML, etc., but then HTTrack won't follow those pages and
you won't get your pictures.

You can, however, use a "compressed" site structure, which puts the images
and the HTML pages into separate folders, making it easy to delete the
pages afterwards if you don't need them.

To do this, open Options > Build > Local Structure Type
Change to "Html in web/html, images/other in web/images"
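For command-line users, the same structure type is selected with the -N
switch and the pages can be deleted afterwards. A minimal sketch, assuming
-N2 is the number for that structure (confirm with `httrack --help`) and
using placeholder URL and paths:

```shell
# Mirror with the split structure, then keep only the images.
# -N2 is an assumption here -- check the structure-type numbers
# with `httrack --help` before relying on it:
#   httrack "http://example.com/" -O ./my-mirror -N2
#
# With that structure the pages end up in web/html and the pictures
# in web/images, so dropping the pages is one command (-rf makes this
# a no-op if the directory does not exist):
rm -rf ./my-mirror/web/html
```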

> b. Can I exclude thumbnails by excluding images under 10 KB?

Use a filter like:
-*.gif*[<10]  -*.jpg*[<10]
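As a command-line sketch, the same scan rules can be passed directly; the
URL and output directory below are placeholders:

```shell
# Mirror a site but skip GIFs and JPEGs smaller than 10 KB; the [<10]
# qualifier is HTTrack's scan-rule size syntax for "under 10 KB".
httrack "http://example.com/gallery/" -O ./gallery-mirror \
    "-*.gif*[<10]" "-*.jpg*[<10]"
```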

> c. Can I set an upper limit of, say, 500 KB?

Options > Limits > Max size of any non-HTML file
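On the command line the same limit is, as far as I recall, the -m switch,
which takes a byte count; confirm the exact form with `httrack --help`:

```shell
# Skip any non-HTML file larger than about 500 KB (500000 bytes).
httrack "http://example.com/" -O ./mirror -m500000
```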

> d. Can I only capture images with nudity?


> Could a small window be provided to display errors as they occur?
During a capture, you can look at the log -- see under the "Log" menu.