> I want to list all the links present on a web site,
> without downloading anything. I just need the list
> of all links.
You can't know what is on a site without downloading its pages.
If you restrict the mirror to HTML files only and keep it from following
external links, you can get what you want by parsing the downloaded files:
Names of all pages: findstr "Mirrored from" *
Names of external links: findstr "external.html" *
Then edit the result, or pass it through sed (http://unxutils.sourceforge.net/).
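As a sketch of that sed pass: mirroring tools that insert a "Mirrored from"
comment (HTTrack-style) leave lines findstr reports as
`file.html:<!-- Mirrored from URL ... -->`, so a single substitution can
strip everything but the URL. The sample input line below is illustrative,
not taken from a real mirror:

```shell
# Reduce findstr output to bare URLs.
# The printf line stands in for the real "findstr "Mirrored from" *" output.
printf 'index.html:<!-- Mirrored from example.com/ by HTTrack -->\n' \
  | sed 's/.*Mirrored from \([^ ]*\).*/\1/'
```

With UnxUtils sed on Windows, quoting differs slightly (use double quotes
around the sed expression), but the substitution itself is the same.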