Hello,
I have read the entire manual, but I am still not getting
the desired results.
All I want to do is copy an entire website
(e.g. www.abc.com) AND only the first-level links
(.html, .jpg, .pdf, whatever...) that point to any other websites.
That's all I want.
There are far too many such links to list each of their
websites individually in the scan rules.
I have been trying for hours and I can't make it work
correctly. I set 'external limit = 1'.
Do I keep the default scan rules? (+*.png +*.gif +*.jpg
+*.css +*.js -ad.doubleclick.net/*)
Do I have to add +www.abc.com/* to the scan rules?
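In case it helps, this is roughly the command-line equivalent of
what I have been trying, as far as I understand the manual. I am
actually using the GUI, so the output folder is just an example,
and I am assuming -%e1 is the option that matches the external
depth limit of 1:

  httrack "http://www.abc.com/" -O "C:\My Web Sites\abc" -%e1 "+*.png" "+*.gif" "+*.jpg" "+*.css" "+*.js" "-ad.doubleclick.net/*"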
Any help would be much appreciated.
Thanks,
Maz