Good morning, dear developers,
I've recently downloaded the current version of httrack.
I use it mainly to copy files from internet sites.
For example, some websites that don't block directory listing expose mp3, avi or jpg files.
Very often the directory tree goes more than three levels deep. Beyond a certain
number of files it becomes impossible to start every download by hand, and very
hard to rebuild the directory tree afterwards.
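For reference, this is roughly the kind of command I run (the address is just a
placeholder, and my options may not be optimal):

    httrack "http://example.com/music/" -O ./mirror -r6 "+*.mp3" "+*.avi" "+*.jpg"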
It would be nice to have in httrack some functions similar to what filezilla offers.
What happened to me is that I had only 24 GB left on my laptop. I thought it
would be enough to copy a website with open directories. I had to stop the
httrack operation for lack of space. I transferred only the directory structure
that httrack had grabbed for me and left behind the other things, like the
hts-cache directory, the .lock file and the .txt log file.
Then I tried to 'resume' my downloads.
With filezilla this is a routine operation. It has an option called
"file exists action" that asks you what to do: skip, overwrite, rename or
resume.
I could not find an easy way to restart my downloads: httrack always
redownloaded the files that already existed.
In the end I had to manually create an exclusion rule for each directory I
considered complete. VERY tedious.
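Concretely, I ended up adding exclusion filters like these, one per directory I
judged complete (the site name is just an example):

    -www.example.com/music/album01/*
    -www.example.com/music/album02/*

Doing that for dozens of directories by hand is exactly the part I would like
httrack to handle for me.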
I notice that httrack knows the file size at the moment the download starts.
It would be nice if it detected whether a file with the same name already
exists and compared the sizes. I don't expect a resume option, even though it
would be great; just a skip option for large files when the sizes are identical.
I also discovered that some directories contain incomplete files, and right
next to them the same file with a '.tmp' extension, also incomplete. I now have
to redo these downloads manually.
With filezilla, when I have a doubt I just reload all the directories with the
resume option, and the failed files get completed. It's a fast operation.
Another welcome option would be the ability to set the speed limit manually. I
fully understand the non-abuse policy, but 25 kB/s is very slow, 100 kB/s seems
more reasonable, and what you get with 'disable security limits' is outright
abuse if your connection is a fast one.
I still think it is a powerful tool: one of my last sessions grabbed 5000 jpg
files, which would not have been easy to do by hand. But that is only half the
work if you then have to check every directory for incomplete files.
Thank you.