Hello, I am trying to download a very large website. After running for hours and hours I keep getting the message "Too Many URLs - Giving Up", and I have to start the program again. I don't know if I am re-writing the same data over and over or if I am capturing more. Is there a way around this? Can you pause the capturing of URLs until the program catches up, so I don't have to restart every time?

Thanks in advance.