Was trying to download a website. Besides countless "not found" pages, HTTrack
ended up in an endless loop, downloading the same few pages over and over
but each time putting them in a new subfolder. I ended up with the same files
nested over 100 folders deep (1 GB+).
It was mainly some .asp pages. I tried to skip them, but that didn't stop it.
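For the record, the skip attempt was an exclude filter on the command line,
something along these lines (the URL and output path here are placeholders):

    httrack "http://www.example.com/" -O "/tmp/mirror" "-*.asp*"   # "-*.asp*" excludes the .asp pages, query strings included

Presumably a -rN depth cap would at least bound the runaway nesting, but an
exclude filter on its own apparently doesn't.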
It seems one needs a bit more intelligence in deciding which files to save:
not only skipping "not found" pages, but also not writing useless files or
recursing endlessly over the same data.
It would be nice if one could "pause" or "cancel" when something like this
endless loop happens, without having to start over from scratch.