HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Crawling process only
Author: Xavier Roche
Date: 02/21/2004 10:41
 
> Is there a way to use HTTrack just for the crawling process,
> i.e. just to get a list of all URLs (e.g. in a text file)?

Set Options / Experts only / Primary Scan Rule to "Just scan".
The detected links will then be listed in hts-cache/new.txt.
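
(On the command line, the equivalent should be the -p priority option with
value 0, i.e. -p0, "just scan, don't save anything".) Once the scan has run,
a small C program along the lines of the sketch below can pull the URLs back
out of hts-cache/new.txt. This is only a sketch: new.txt is tab-separated,
but its exact column layout differs between HTTrack versions, so the program
simply prints every field that looks like an absolute URL.

/* Sketch: extract URL-looking fields from hts-cache/new.txt
 * after a "just scan" run.  The file is tab-separated; since the
 * column order varies by HTTrack version, we print every field
 * that starts with http:// or https://. */
#include <stdio.h>
#include <string.h>

int main(void) {
    FILE *f = fopen("hts-cache/new.txt", "r");
    if (!f) {
        perror("hts-cache/new.txt");
        return 1;
    }
    char line[4096];
    while (fgets(line, sizeof line, f)) {
        /* Walk the tab-separated fields of each log line. */
        for (char *tok = strtok(line, "\t\r\n"); tok != NULL;
             tok = strtok(NULL, "\t\r\n")) {
            if (strncmp(tok, "http://", 7) == 0 ||
                strncmp(tok, "https://", 8) == 0) {
                printf("%s\n", tok);
            }
        }
    }
    fclose(f);
    return 0;
}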

If you also want external links, you'll have to write a small
plugin that wraps the fetched URLs (see the HTTrack developer
page, <http://www.httrack.com/html/dev.html>); a rough sketch
follows.
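
The sketch below is only an illustration of such a plugin, not code from the
developer page. It assumes the callback interface of the later
httrack-library releases (hts_plug, CHAIN_FUNCTION and the linkdetected
callback); the wrapper interface documented in 2004 was different, so adjust
it to the headers shipped with your version.

/* Sketch of a callback module that logs every detected URL,
 * including external ones.  Callback names and macros are taken
 * from the later httrack-library API and may need adjusting. */
#include <stdio.h>
#include "httrack-library.h"
#include "htsopt.h"
#include "htsdefines.h"

/* Called for each link HTTrack detects while parsing pages. */
static int my_linkdetected(t_hts_callbackarg *carg, httrackp *opt,
                           const char *link) {
  /* Chain to any previously installed callback first. */
  if (CALLBACKARG_PREV_FUN(carg, linkdetected) != NULL) {
    if (!CALLBACKARG_PREV_FUN(carg, linkdetected)(CALLBACKARG_PREV_CARG(carg),
                                                  opt, link)) {
      return 0;
    }
  }
  fprintf(stderr, "detected: %s\n", link);
  return 1;  /* accept the link */
}

/* Module entry point, called when HTTrack loads the plugin. */
EXTERNAL_FUNCTION int hts_plug(httrackp *opt, const char *argv) {
  CHAIN_FUNCTION(opt, linkdetected, my_linkdetected, NULL);
  return 1;
}

The module would be compiled as a shared library against the HTTrack headers
and loaded with the --wrapper option; the exact build and loading syntax
depends on the HTTrack version, so check the developer page above.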

 