I'd like to copy a site where I submit a search request and get a session (jsession) with the results. The results are separated into pages, with page1, page2, ... in the URL. Each result page carries links I want to copy.
If I paste one result page into the URL field in the GUI and set up the filter -* +*link=* it does exactly what I want with all the links on that one result page.
If I paste more than one result page into the URL field, I get a lot of other data I don't need. It seems to store all the result pages several times, ending up with a huge amount of data while missing the subpages I need the most.
I tested the console version with:
httrack "www.url.com/page2" -O "d:\test" "-* +*link=*" -v
and it goes completely wild. The -* seems to be ignored, maybe the whole filter parameter is.
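For reference, this is roughly what I would have expected to work, assuming httrack accepts several start URLs on one command line and that each filter has to be passed as its own argument rather than as one quoted string (www.url.com and d:\test are just placeholders from my test above):

httrack "www.url.com/page1" "www.url.com/page2" -O "d:\test" "-*" "+*link=*" -v

Is splitting the filters into separate arguments like this the correct way to pass them, or is there another reason the -* is ignored?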