HTTrack Website Copier
Free software offline browser - FORUM
Subject: Any rationalization or intention to fix this?
Author: Ingomar Wesp
Date: 01/22/2009 14:33
 
I hate to be such an annoyance, but I would really like to know whether the
behavior I outlined in the previous post is intentional or whether it is a bug.

In my opinion it doesn't make much sense to exclude files at depth 0 via MIME
filters (except for some scenarios where the input comes from a dumb script),
whereas it makes a whole lot of sense to apply MIME filters only to files at a
depth greater than 0.

Consider the following scenario: some CMS generates URLs like
<http://foo.bar/?id=1234> for all the files it serves. There is absolutely no
meaningful way to use a URL filter to keep certain file types from being
downloaded. However, if MIME filters didn't apply to depth-0 URLs and you
wanted to download a specific page (with all images, CSS files, CSS files
included from other CSS files, etc.), you could just use '-mime:text/html'
combined with an arbitrarily high depth.
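
For illustration, a minimal invocation under the proposed behavior might look
like this (the output directory and depth value are placeholders; '-O' sets
the output path, '-r6' the recursion depth, and the trailing scan rule
'-mime:text/html' would then only kick in from depth 1 onward):

  httrack "http://foo.bar/?id=1234" -O /tmp/mirror -r6 "-mime:text/html"

The depth-0 page itself would still be fetched, further HTML pages linked from
it would be skipped, and images and stylesheets would come through.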

So, am I the only one who thinks that would make for a fantastic feature / bug
fix?
 