> been offered for download, such as zips, exes,
> rars, pspimages, psptubes, pngs, etc. ... I don't
> want to download the whole website. Is there a way
Unless you have the URLs of those files, you must let it spider the site to
find them. If you know what file types you want and the extension(s) of the
html pages, you can filter: -* +*.html +*.zip ...
If the html pages don't have extensions, then no.
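Assuming this is the httrack command line, a minimal sketch of such a filtered run (example.com, the output directory, and the target extension are placeholder assumptions; the filters exclude everything with -*, then re-include the html pages to spider and the files you actually want):

```shell
# Hypothetical example: spider only the html pages of example.com,
# and keep only the .zip files found on them.
# Quote the filters so the shell doesn't expand the * wildcards.
httrack "http://example.com/" -O ./mirror "-*" "+*.html" "+*.zip"
```

The order matters: the blanket -* exclusion comes first, and each + pattern punches a hole in it.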
1) Always post the command line used (or log file line two) so we know what
the site is, what your settings are, etc.
2) Always post the URLs you're not getting and from what URL.
3) Always post anything USEFUL from the log file.
4) If you want everything, use the near flag (get non-html files related), not
filters.
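The near flag mentioned above can be sketched as follows (assuming httrack's --near option; example.com and the output directory are placeholders):

```shell
# Hypothetical example: mirror the site and also fetch non-html files
# (images, archives, etc.) linked near the downloaded html pages,
# without writing any filters.
httrack "http://example.com/" -O ./mirror --near
```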