> Is it possible to map a website copy? That is, instead of
> actually downloading the files, is it possible to just get
> a list of the directory structure that would have been
> created on my system?
Err, not really - there is no built-in option for that. You
can grab just the html structure (which can still be quite
long) and analyze the downloaded files to work out what
directory layout would have been created.
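Once a mirror has finished, a small script can turn the
downloaded tree into exactly that list. A rough sketch in
Python, assuming the finished project folder is ./mirror (a
placeholder path, not something the mirroring tool picks
for you):

import os

MIRROR_ROOT = "./mirror"  # placeholder: your finished mirror folder

# Walk the downloaded copy and print the folder/file layout
# that was created on disk, indented by depth.
for dirpath, dirnames, filenames in os.walk(MIRROR_ROOT):
    depth = dirpath[len(MIRROR_ROOT):].count(os.sep)
    print("  " * depth + os.path.basename(dirpath) + "/")
    for name in sorted(filenames):
        print("  " * (depth + 1) + name)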
You can also do a "regular" mirror and exclude all non-html
files bigger than 1KB (which rules out almost all non-html
files):
-*
+www.thesiteyouwanttobemapped.com/foobar/*
-*[>1]
+www.thesiteyouwanttobemapped.com/*.html
+www.thesiteyouwanttobemapped.com/*.php
(this example assumes the site serves html and php pages -
use asp or cgi extensions instead, depending on your needs)
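Assuming you are using HTTrack (these are its scan-rule
patterns), the same rules can also be appended after the URL
when running the command-line version. A rough sketch, where
the URL and the /tmp/sitemap output folder are placeholders:

httrack "http://www.thesiteyouwanttobemapped.com/" -O "/tmp/sitemap" \
  "-*" "+www.thesiteyouwanttobemapped.com/foobar/*" "-*[>1]" \
  "+www.thesiteyouwanttobemapped.com/*.html" \
  "+www.thesiteyouwanttobemapped.com/*.php"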