I don't want a whole website copied, I just need to
see what files are there (this is from 2002). Making a
shortcut with --spider didn't work, and I don't see any options for this either.
Well, you can use the spider option (--spider, or in
Options/Experts Only: "store html files only") and activate
Options/Links/Test validity of all links. This will crawl
the website and check all links without downloading the content.
To get the list, look at the hts-cache/new.lst file. The current
beta release (3.20-b6) also includes a new.txt file, which is
much more complete (MIME type, scan result, and so on).
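For reference, a minimal command-line sketch of spider mode (example.com and the output directory name are placeholders, not from the original post):

    httrack "http://example.com/" --spider -O ./spidertest
    cat ./spidertest/hts-cache/new.lst

The --spider shortcut makes HTTrack test the links rather than save the pages, so afterwards the hts-cache files under the output directory hold the list of URLs it found.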