>b) call an external compression program (zip, arj)
Can easily be done with the -V option (-V "tar uvfz
foo.tar.gz \$0") under Linux - BUT in this example,
tarring is really slow
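For reference, a rough sketch of a full invocation
(placeholder URL and paths; note that GNU tar cannot
update (u) a gzip-compressed archive, so a plain .tar
is used here and compressed once the mirror is done):

  httrack "http://www.example.com/" -O /tmp/mirror \
    -V "tar uvf /tmp/mirror-files.tar \$0"
  gzip /tmp/mirror-files.tar

-V runs the given system command after each saved
file, with \$0 replaced by the local file name.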
>c) some other ideas
Using the user-defined structure, like
%h%p/dir%q/%n%q.%t or %h%p/dir%t/%n%q.%t - but I
suspect this solution would be useless, as it will
generate too many directories, too!
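(For illustration, such a structure would be passed
with the -N option - a sketch only, with a placeholder
URL:

  httrack "http://www.example.com/" -O /tmp/mirror \
    -N "%h%p/dir%q/%n%q.%t"

where %q is the short hash of the query string, so
query-generated pages are spread over dir* subfolders
instead of piling up in a single directory.)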
Large websites and filesystem limits are a problem,
especially when getting a large number of files into a
common directory (the worst thing is those
foo.cgi?page=1, foo.cgi?page=2, ... foo.cgi?page=999999)