> Well, I'm trying to download all nfo-files (about 100)
> from a certain directory. The problem is that they are
> linked to from 100 PHP-generated HTML pages, and these
> 100 pages themselves are linked to from yet another
> PHP-generated page. So all the links to the nfo files
> look like 'output.php?id=1' rather than
> 'www.server.com/nfo/info1.nfo'.
Hmm, you'll either have to download all 100 pages and
their associated nfo's (for example, using filters like
"-* +www.foo.com/bar.php* +*.nfo"), or find a page that
lists all the nfos. In any case, you have to find links
to these files.
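For example, a single command covering both the pages and the
nfo's might look like this (a sketch; www.foo.com, index.php
and ./mirror are placeholders to adapt to the real site):

  # mirror only the bar.php pages and any .nfo files they link to
  httrack "http://www.foo.com/index.php" -O ./mirror "-*" "+www.foo.com/bar.php*" "+*.nfo"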
An alternative solution would be to use the URL-list option
with a script-generated list of the 100 pages:
  # print one page URL per line, for id 1..100
  i=1; while test "$i" -le 100; do
      echo "www.foo.com/bar.php?id=$i"; i=$((i+1))
  done