I want to download all the image files stored in
a specific folder, e.g. (www.fantasya.net/galeries/1),
but I don't want to grab each link separately to
do that.
In this case, the pages are PHP, e.g.
(http://fantasya.net/galerie.php?imgsection=1&ref=1),
and each image on the page is loaded by JavaScript.
I have already tried many scan rules and limits, but
every time the program starts to grab the whole site
instead of just scanning and downloading only the
images I want.
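For example, I tried scan rules along these lines (the exact patterns are just my attempt, based on the URLs above, so they may well be wrong):

```
-*
+fantasya.net/galerie.php?imgsection=1*
+*.jpg
+*.png
+*.gif
```

The idea was to exclude everything by default, allow only that one gallery page, and then allow image files, but the crawl still spread over the whole site.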
How can I solve this problem?
PS: Sorry about my poor English, I'm Brazilian :)