I am running into the problem of it wandering off and downloading vast amounts of other websites. Since the site only shows pages 1-20 and then goes up 10 at a time, I set the recursion depth to 400, as there are a good 300 pages of threads. The crawl starts off fine, but then it goes off on a tangent, somehow ends up on Wikipedia, and starts downloading large amounts of data that I do not need.
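For reference, the crawl is essentially a recursive download with the depth turned up, roughly like the sketch below (this is a wget sketch with a placeholder URL; the exact tool and flags may differ from what I actually have):

    # Depth-400 recursive crawl; --span-hosts with no domain restriction is
    # presumably what lets it wander off onto unrelated sites like Wikipedia.
    wget --recursive --level=400 \
         --page-requisites \
         --span-hosts \
         --convert-links --adjust-extension \
         https://forum.example.com/forum/threads/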
I do not know in advance which domains the images will be served from, but I need every image used on the specific site I am backing up, without other sites (or parts of them) being backed up as well. In other words, I only need that one site, plus any resources/images its pages require. How do I do this?
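To make it concrete, my best guess is something along these lines (again only a sketch, assuming wget and a placeholder domain), but I suspect that restricting --domains to the forum's own host would also block the off-site images I actually want:

    # Keep the recursion on the forum's own host, but then how do the
    # page requisites hosted elsewhere (unknown domains) get fetched?
    wget --recursive --level=400 \
         --page-requisites \
         --span-hosts --domains=forum.example.com \
         --convert-links --adjust-extension \
         https://forum.example.com/forum/threads/

Is there a combination of options that keeps the crawl itself on the one site while still allowing the page requisites to come from whatever hosts happen to serve them?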