HTTrack Website Copier
Free software offline browser - FORUM
Subject: New feature suggestion: External Link Scan Rules
Author: Filer
Date: 07/24/2002 19:52
 
HTTrack could use a second set of scan rules for the "external depth" links:

Say I want +*website.com/*, but from the links that point outside this site
(from anywhere within it) I would only like to get *.html, *.js and *.zip
files, and NOT get any files from sites that are external to those external
sites.
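
To illustrate, roughly what I can write today (filter syntax quoted from
memory, so correct me if it's off):

    httrack http://website.com/ -O mirror "+*website.com/*" "+*.html" "+*.js" "+*.zip"

As far as I understand, those +*.html / +*.js / +*.zip rules apply to every
link the engine considers, not just to the first hop of external links, which
is why I think a second, external-only set of rules would be needed.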

As far as I can tell, this cannot be done right now. Or have I misunderstood
how the "external depth" works? Does it only scan sites external to the
original, but not sites external to those externals? My inquiring mind needs
to know. Sorry, I'm too lazy to test this out myself :-(
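
For reference, the only knob I know of is the external depth option, which (if
I remember the name right) looks something like:

    httrack http://website.com/ -O mirror "+*website.com/*" --ext-depth=1

but that seems to control only how far out the engine goes, not which file
types are taken at each external level.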

I wonder if this might lead to some kind of "scan rule scripting language"
being developed in the future, to give people more ways to define what they
really want to get, from where, and how. Just a few conditional statements
would go a long way, like *.php->*.zip, which would read "from *.php files,
scan *.zip links", or something like that. Or e1..en notation describing the
levels of external links: level e1 being the first hop out of the original
site, e2 the first hop out of the first external, and so on, with scan rules
written separately for each external level.
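
To make that concrete, here is a made-up rule set in the imagined notation
(none of this exists, it is only a sketch of the idea):

    +*website.com/*
    e1: +*.html +*.js +*.zip
    e2: -*

which would read: mirror everything under website.com, take only .html, .js
and .zip files from first-level external links, and follow nothing beyond
that.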

Of course, a mad httracker can always suggest more features, useful or not,
than all the developers could ever program :-)
 