I don't know if an example is needed, but to illustrate: I want to download
celebrity.december1975.com (it's a college project), and one of the links on it
is <http://www.youtube.com/user/december1975utube>. All I'm trying to do is
download the celebrity.december1975.com website plus that december1975 page.
What seems to be happening instead is that the links on the YouTube page are
also being followed, added, and downloaded. It's the same with other external
links too. As I say, I set the external link depth to 1 and also changed robots
to "ignore" (I tried it with "follow robots.txt" as well, same result).