Dear Supporters,
I'm trying to download only the following dynamic pages:
<http://www.act.co.il/forums/viewtopic.php?t=1..500>
I tried these settings:
filter: +www.act.co.il/forums/viewtopic.php?t=*[1-500]
spider: no robots.txt
web address: <http://www.act.co.il/forums>
Action: Download Web Site.
but it keeps downloading lots of files that I don't
want. I want only the 500 HTML files that the links
www.act.co.il/forums/viewtopic.php?t=*[1-500] generate
(as if I had opened each of these 500 links in my browser
and done File > Save As on each one).
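For comparison, outside of the copier tool the same 500 fetches could be scripted directly. This is only a sketch of that alternative, assuming wget is installed; the URLs are the ones from the post, and topic_$t.html is a hypothetical output naming scheme:

```shell
#!/bin/sh
# Generate the exact list of 500 topic URLs (t=1..500).
# Feeding this list to wget fetches only those pages, nothing else.
for t in $(seq 1 500); do
  echo "http://www.act.co.il/forums/viewtopic.php?t=$t"
done > urls.txt

# Uncomment to actually download each page to its own file:
# while read -r url; do
#   t=${url##*=}
#   wget -O "topic_$t.html" "$url"
# done < urls.txt
```

The download loop is left commented out so the script can be inspected (and the URL list checked) before hitting the server 500 times.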
Please help me!
Thank you!
Manu
moon@asia.com