HTTrack Website Copier
Free software offline browser - FORUM
Subject: Struggling -syntax to limit depth of search
Author: Andrew
Date: 07/31/2013 16:27
 
Hi

HTTrack is working nicely for me, except that I can't craft the scan rules
needed to download only the files I want.

I've tried a few permutations with different results, but I either get too
little back or far too much, and I don't want to download the entire site as
it's very big.

I'll try to explain the layout of the site in the hope that someone more
knowledgeable is kind enough to help.

I'm looking to start getting the files from
example.com/subdir1/subdir2/projects

The HTML at that location is a paginated set of results. Each item on the
page has a link to download the project files - I want selected filetypes from
the projects listed on the starting page.

However, these files are located at
example.com/subdir3/subdir4/files


All the rules I've tried to create so far download too much from
example.com/subdir3, which has many more subdirs than I'm interested in (I
only want the subdir4s mentioned at my starting URL and its subsequent
paginated pages).
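
For what it's worth, here is roughly the sort of command I've been
experimenting with (the paths are the placeholders above, *.zip stands in
for the filetypes I actually want, and the depth and filter rules are just
my guesses):

  httrack "http://example.com/subdir1/subdir2/projects/" \
    -O "./projects-mirror" \
    -r3 \
    "-*" \
    "+*example.com/subdir1/subdir2/projects*" \
    "+*example.com/subdir3/subdir4/files/*.zip"

My thinking was that "-*" blocks everything by default, the first "+" rule
lets HTTrack follow the paginated project listings, the second "+" rule
allows only the files under subdir4, and -r3 keeps the crawl shallow - but
either it misses files on the later pages or it pulls in far more of
subdir3 than I want.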

Is there any way to be this specific with HTTrack? If so, could someone point
me in the direction of a solution, please?
Any help appreciated

Andrew


 