HTTrack Website Copier
Free software offline browser - FORUM
Subject: Different behavior GUI / console and other things
Author: ET
Date: 01/12/2016 22:19
 
I'd like to copy a site where I submit a search request and get a jsession
with the results. The results are split across pages, with page1, page2, ...
in the URL. Each results page carries links I want to copy.

If I paste one result page into the URL field of the GUI and set up the
filters -* +*link=*, it does exactly what I want with all the links on that
one result page.
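
As far as I understand them, the scan rules are applied as an ordered list:
-* first excludes everything, then +*link=* re-includes any URL containing
"link=". In the GUI I enter them in the Scan rules box under Set options,
roughly like this, assuming "link=" really appears in every URL I want:

-*
+*link=*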

If I paste more than one result page into the URL field, I get a lot of
other data I don't need. It seems to store all the result pages several
times, ending up with a huge amount of data while missing the subpages I
need most.
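
To be concrete, the URL list I paste looks roughly like this (made-up names;
the real URLs carry the jsession id):

www.url.com/results;jsessionid=ABC123?page=1
www.url.com/results;jsessionid=ABC123?page=2
www.url.com/results;jsessionid=ABC123?page=3

with the same -* +*link=* filters as before.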

I tested the console version with:
httrack "www.url.com/page2" -O "d:\test" "-* +*link=*" -v
and it goes completely wild. The -* seems to be ignored, maybe the whole
filter parameter is.
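
Could the quoting be the problem? The examples in the httrack manual pass
each scan rule as its own argument, so I would have expected something like
this (same URL and output directory, only the rules split into two
arguments):

httrack "www.url.com/page2" -O "d:\test" "-*" "+*link=*" -v

With both rules inside one quoted string, the whole thing is perhaps parsed
as a single pattern that matches nothing.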
 