I'm trying to download a website, but I don't know how much bandwidth the
site has, as it's a small website. The pages and images are 99% static.
With that in mind, I want to download it in small chunks, a bit each day.
Is there a way to skip already downloaded parts?
For example, yesterday I downloaded 100 images from a directory that has 1000.
How do I prevent the program from downloading those 100 images again?
I don't see an option to do that.
Thanks,
Bill