HTTrack Website Copier
Free software offline browser - FORUM
Subject: Large website - one folder
Author: Trev
Date: 06/20/2004 20:17
 
I have searched the forum for information on how to prevent 
a large website from being mirrored into a single directory, 
and instead split it across several smaller directories, 
especially since Windows appears to have a practical limit 
on the number of files in one directory (approximately 
21,000).

I have read in the forum how to use the build options to 
separate files into different folders, like the example from 
one of the threads below: 


-------------------------------
Example:
<http://www.example.com/images/image.cgi?sect=flowers&id=45>
<http://www.example.com/images/image.cgi?sect=animals&id=98>
..

Then, use:
%h%p/%[sect::/::]%n%[id:-:::].%t

(note: the %[param:before:after:notfound:empty] syntax will 
include the 'param' query string parameter, 
prepending 'before' and appending 'after' if found, and 
using 'notfound' and 'empty' as replacement if the 
parameter could not be found, or was empty)
---------------------------------
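To make the quoted rule concrete, here is a rough Python illustration of how one %[param:before:after:notfound:empty] token behaves (an approximation for clarity only, not HTTrack's actual code; the function name param_sub is made up):

```python
# Mimic one %[param:before:after:notfound:empty] token, as described
# in the quoted note: 'before' and 'after' wrap the value when found,
# 'notfound' replaces a missing parameter, 'empty' an empty one.
from urllib.parse import urlparse, parse_qs

def param_sub(query, name, before="", after="", notfound="", empty=""):
    params = parse_qs(query, keep_blank_values=True)
    if name not in params:
        return notfound          # parameter absent from the query string
    value = params[name][0]
    if value == "":
        return empty             # parameter present but has no value
    return before + value + after

url = "http://www.example.com/images/image.cgi?sect=flowers&id=45"
q = urlparse(url).query
print(param_sub(q, "sect", after="/"))   # %[sect::/::]  -> "flowers/"
print(param_sub(q, "id", before="-"))    # %[id:-:::]    -> "-45"
```

So for the first example URL, %h%p/%[sect::/::]%n%[id:-:::].%t would expand the middle tokens to "flowers/" and "-45", giving a per-section subdirectory.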

The %[param] function appears to work very well when there 
are at least two query-string parameters in the URL, but 
what if there is just one?  Is it not possible to separate 
then?
ie. <http://www.example.com/text/details?id=25983>

With only one query-string parameter, everything would still 
go into the same directory. 
Path: www.example.com/text/details/id25983.html

Is there any way to split the files in the "details" 
directory further to overcome the Windows limit on the 
maximum number of files in one folder?  
Say, the first 20,000 files in folder "details1" and the 
rest in "details2"?  Or by using the build options in 
another way that I haven't thought of?
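For what it's worth, the details1/details2 split described above could also be done after the mirror finishes, with a small post-processing script rather than a build option. A minimal sketch (this is not an HTTrack feature; the folder names and the split_folder helper are hypothetical, and note that moving files this way will break relative links between the mirrored pages unless they are rewritten too):

```python
# Post-processing sketch: move files out of one oversized mirror
# folder into numbered sibling folders ("details1", "details2", ...)
# holding at most `chunk` files each.  Not an HTTrack option.
import os
import shutil

def split_folder(src, chunk=20000):
    # Sort for a stable, reproducible assignment of files to folders.
    files = sorted(f for f in os.listdir(src)
                   if os.path.isfile(os.path.join(src, f)))
    for i, name in enumerate(files):
        dest = f"{src}{i // chunk + 1}"   # e.g. .../details1, .../details2
        os.makedirs(dest, exist_ok=True)
        shutil.move(os.path.join(src, name), os.path.join(dest, name))
```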

Trev
 