> I'm using the Linux command-line HTTrack to ignore files I don't need.
> Code: # httrack "www.readynas.com/download/GPL/" -O "./www.readynas.com" -R5H0Ko0s0zZd "+*.readynas.com/*" -*.zip -*.bz2 -*.tar -*.gz -v
> Files in the www.readynas.com/download/GPL dir download correctly. However, files in subdirectories below this level seem to be cloaked: they can be accessed by my browser, but not by HTTrack.
That's because you restricted it to 5 levels down (-R5). Note that HTTrack options are case-sensitive: lowercase -rN sets the mirror depth, while uppercase -RN sets the number of retries.
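Assuming a depth limit was the intent, a corrected invocation might look like the sketch below. It is not a definitive command line — the URL and filter patterns are carried over from the original post, and you should adjust the depth to reach however many subdirectory levels you actually need:

```shell
# Sketch of a deeper mirror: lowercase -r sets the mirror depth
# (raise it, or omit it for the default, to reach deeper subdirectories).
# Filters after the URL: keep everything on the site, skip the archives.
httrack "www.readynas.com/download/GPL/" \
  -O "./www.readynas.com" \
  -r8 \
  "+*.readynas.com/*" "-*.zip" "-*.bz2" "-*.tar" "-*.gz" \
  -v
```

Quoting the filter patterns keeps the shell from expanding the wildcards before HTTrack sees them.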
> If I download a file from one of the directories directly, the link is reported as
I get a Forbidden error on that link too.
> I created a cookies.txt and disabled robots.txt,
HTTrack creates its own cookies.txt. Don't mess with its insides.
Did you even look at robots.txt before disabling it? Why did you disable it? It's irrelevant here.
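For reference, you don't need to touch the file at all: HTTrack's robots.txt behaviour is controlled by its -sN option (0 = never obey, 2 = always obey, the default). A sketch, again reusing the URL from the thread:

```shell
# Mirror while ignoring the site's robots.txt entirely (-s0).
# -s2 (the default) obeys it; no manual editing of robots.txt is needed.
httrack "www.readynas.com/download/GPL/" -O "./www.readynas.com" -s0 -v
```

Check whether robots.txt actually blocks the paths you care about before reaching for -s0 — if it doesn't, the option changes nothing.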