Hi, very nice program!
I read the program's help (with some difficulty, because
my English is not very good), but I didn't find how to use it for
my purpose, so here is my question:
Is it possible to use HTTrack to fetch ONLY the URLs (with
their arguments, if they are PHP pages) of a website recursively, and save
them all in a file?
Example: I run "httrack www.mysite.com" and it only generates a
file (without downloading the pages); the file would contain:
<http://www.mysite.com/index.htm>
<http://www.mysite.com/pics.php?picture=pic.jpg>
<http://www.mysite.com/pics/pics1.htm>
<http://www.mysite.com/pics/pics2.htm>
<http://www.mysite.com/pics/pics.htm>
<http://www.mysite.com/info/info.php?name=Me&page=10>
<http://www.mysite.com/info/info.php?name=Me&page=20>
Is it possible?
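For example, I imagine something like this (just an untested sketch; I am assuming HTTrack's -p0 "do not save anything" option and the crawl log it writes to hts-cache/new.txt, and the sample log lines below are made up to stand in for a real crawl):

```shell
# Hypothetical: crawl without storing pages, writing only the cache/log.
# The site name and output directory are placeholders.
#   httrack "http://www.mysite.com" -p0 -O mysite-scan

# Simulate two lines of the tab-separated hts-cache/new.txt log
# (real logs have more fields; only the URL field matters here):
printf 'date\t1234\tflags\t200\tOK\ttext/html\thttp://www.mysite.com/index.htm\tindex.htm\n' > new.txt
printf 'date\t1234\tflags\t200\tOK\ttext/html\thttp://www.mysite.com/pics.php?picture=pic.jpg\tpics.html\n' >> new.txt

# Pull every unique URL out of the log into a plain list:
grep -oE 'https?://[^[:space:]]+' new.txt | sort -u > urls.txt
cat urls.txt
```

With a real crawl, the same grep line run against mysite-scan/hts-cache/new.txt should produce the kind of URL list described above.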
Thanks!