HTTrack Website Copier
Free software offline browser - FORUM
Subject: Need single site with any and all img
Author: Wes
Date: 08/03/2017 23:12
I am running into the problem of HTTrack going off and downloading vast amounts of
other websites. Since the site only shows pages 1-20 and then increments by 10 each
time, I set the depth to 400, as there are a good 300 pages of threads. It goes
well at first, but then it goes off on a tangent, somehow ends up on Wikipedia, and
starts downloading large amounts of data that I do not need.

I do not know which domains the images will come from, but I need all the images
on the one specific site I am backing up, without also backing up other sites (or
parts of them). I only need that one site backed up, plus any resources/images its
pages require. How do I do this?
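(A sketch of the kind of command that should do this, with a placeholder URL and output directory you would replace with your own. The scan filters keep the crawl on the one site, and the `-n` option, "get non-html files near a link", fetches images that the pages reference even when they are hosted on other domains, without mirroring those sites. Note that the depth option only limits how many links deep the crawl goes; it does not keep the crawl on one domain, which is what the filters are for.)

```shell
# Hypothetical example -- forum.example.com stands in for the real site.
#   -O   output directory for the mirror
#   -n   fetch "near" files (images/resources linked from mirrored pages),
#        even if they live on another domain, without crawling that domain
#   "+*forum.example.com/*"  scan rule: stay on the one site
#   "-*wikipedia.org/*"      scan rule: explicitly exclude Wikipedia
httrack "https://forum.example.com/" \
    -O ./site-backup \
    -n \
    "+*forum.example.com/*" \
    "-*wikipedia.org/*"
```

With this setup the depth setting can stay high for the ~300 pages of threads, since the filters, not the depth, are what stop the crawl from wandering off-site.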
