> For the near thing, it may be a good idea to override depth using this
> option - I'll try to add that!
>
I've been needing this for some time -- it's really annoying that httrack
can't just fetch a page and all its requirements in the way I expected with
-r1 -n. So I created this patch against 3.33.16, which appears to work. I
think it's better than Monty's change, which is limited to certain file
suffixes, something that could already be achieved with filters. This one
makes the -n option behave the way I think most people expect when it is
used in conjunction with -r. I have not tested exhaustively, but it seems to
work for me.
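
For anyone who wants the gist before reading the diff: the whole change is
one extra test in the condition that forbids links beyond the recursion
depth limit. Here is a tiny standalone C sketch of that test -- the names
(forbid_at_depth_limit, near_enabled, is_html) are made up for illustration
and are not httrack internals; the real code uses liens[ptr]->depth,
opt->nearlink and ishtml(fil), as in the patch below.

#include <stdio.h>

/* Sketch only (made-up names, not httrack code): with the patch, a link
 * sitting at the depth limit is forbidden only when -n is off, or when
 * the file is HTML; non-HTML "near" files are now allowed through. */
static int forbid_at_depth_limit(int depth, int near_enabled, int is_html)
{
    return (depth <= 1) && (!near_enabled || is_html);
}

int main(void)
{
    printf("%d\n", forbid_at_depth_limit(1, 1, 0)); /* 0: -n on, non-HTML -> fetched   */
    printf("%d\n", forbid_at_depth_limit(1, 1, 1)); /* 1: -n on, HTML     -> still cut */
    printf("%d\n", forbid_at_depth_limit(1, 0, 0)); /* 1: -n off          -> old rule  */
    return 0;
}

So with -n set, only HTML links are still stopped at the depth limit;
images, stylesheets and other page requirements get fetched.
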
--- src/htswizard.c.orig Sun May 9 13:57:47 2004
+++ src/htswizard.c Mon Oct 17 11:28:59 2005
@@ -135,8 +135,10 @@
 }
 /* Niveau 1: ne pas parser suivant! */
+ /* 2005/10/17: Ben Wheeler <b.wheeler@ulcc.ac.uk>
+ * Made -n/--near override max recursion depth for non-HTML files */
 if (ptr>0) {
- if (liens[ptr]->depth <= 1) {
+ if (liens[ptr]->depth <= 1 && (! opt->nearlink || ishtml(fil))) {
 forbidden_url=1; // interdire récupération du lien
 if ((opt->debug>1) && (opt->log!=NULL)) {
 fspc(opt->log,"debug"); fprintf(opt->log,"file from too far level ignored at %s : %s"LF,adr,fil);