HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: File listings
Author: Xavier Roche
Date: 11/01/2003 11:19
> I am trying to copy a site with lots of files. It is using
> the directory browsing function of a web server, and it gives
> me pages with headings so I can sort the page by file name,
> file size, file date, etc.
> The problem is that every page generates several 
> files, each one corresponding to a different sort option.
> This is generating a lot of overhead and unnecessary files
> since I am only interested in the meat, ie., the files in
> the directories.
> How can I avoid this?
Use scan rules; such as: -*listing?key=* +*listing?key=size

(here all links matching *listing?key=* will be forbidden,
except those with key=size)
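The rules above follow HTTrack's filter convention: a leading - forbids matching links, a leading + allows them, and a later rule overrides an earlier one. A minimal Python sketch of that behavior (the URLs are illustrative, not from the original site):

```python
from fnmatch import fnmatchcase

def allowed(url, rules):
    """Sketch of HTTrack-style scan rules: each rule is "+pattern"
    (allow) or "-pattern" (forbid); the last matching rule wins.
    Links matched by no rule are allowed by default."""
    verdict = True
    for rule in rules:
        sign, pattern = rule[0], rule[1:]
        if fnmatchcase(url, pattern):
            verdict = (sign == '+')
    return verdict

rules = ["-*listing?key=*", "+*listing?key=size"]
print(allowed("http://example.com/listing?key=size", rules))  # kept
print(allowed("http://example.com/listing?key=name", rules))  # skipped
print(allowed("http://example.com/files/data.zip", rules))    # kept
```

Because the + rule comes after the - rule, the key=size variant of each listing survives while the other sort orders are dropped, and ordinary files untouched by either pattern are still mirrored.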
All articles

Subject            Author        Date
File listings                    10/31/2003 08:59
Re: File listings  Xavier Roche  11/01/2003 11:19