HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: File listings
Author: Xavier Roche
Date: 11/01/2003 11:19
 
> I am trying to copy a site with lots of files. It is using
> the directory browsing function of a web server, and it
> gives me pages with headings so I can sort the page by file
> name, file size, file date, etc.
> The problem is that every page generates several index.html
> files, each one corresponding to a different sort option.
> This is generating a lot of overhead and unnecessary files,
> since I am only interested in the meat, i.e., the files in
> the directories.
> How can I avoid this?
Use scan rules, such as:
-www.example.com/*listing?key=* +www.example.com/*listing?key=size

(Here all links such as www.example.com/..listing?key=.. will be
forbidden, except those with key=size.)
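
For reference, a minimal command line applying such rules might look
like this (the site URL and the ./mirror output directory are
placeholders for your own values; HTTrack applies the last matching
scan rule, so the + rule re-allows the key=size listings that the
- rule would otherwise exclude):

    httrack "http://www.example.com/" -O "./mirror" \
        "-www.example.com/*listing?key=*" \
        "+www.example.com/*listing?key=size"

Quote the filter arguments so the shell does not expand the * and ?
characters before HTTrack sees them.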
 