If you want the links available off-line, we need to alter the filters.
Try the action "Download all sites in pages (multiple mirror)":
1. Put the 2 desired URLs in the project's Web Addresses box:
<http://www.cert-la.com/calendar2/calendar.cgi>
<http://www.cert-la.com/calendar/calendar.cgi>
2. Options >> Scan Rules: add rules so only your domain's event pages are included:
-*
+www.cert-la.com/calendar2/calendar.cgi
+www.cert-la.com/calendar/calendar.cgi
+www.cert-la.com/calendar/calendar.cgi?view=Event*
+www.cert-la.com/calendar2/calendar.cgi?view=Event*
-www.cert-la.com/calendar2/calendar.cgi?view=Search
-www.cert-la.com/calendar/calendar.cgi?view=Search
-www.cert-la.com/*/calendar.cgi?template=login.html
-www.cert-la.com/calendar2/calendar.cgi?calendar=*
-www.cert-la.com/calendar/calendar.cgi?calendar=*
-www.cert-la.com/calendar2/calendar.cgi?year=*
-www.cert-la.com/calendar/calendar.cgi?year=*
-www.cert-la.com/calendar2/calendar.cgi?view=Month
-www.cert-la.com/calendar/calendar.cgi?view=Month
-www.cert-la.com/calendar2/calendar.cgi?style=*
-www.cert-la.com/calendar/calendar.cgi?style=*
3. Options >> Limits >> maximum mirroring depth = 1, internal links ONLY
(maximum external depth = 0)
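As I understand HTTrack's scan rules, they are checked top to bottom, the last rule that matches a URL wins, and "*" is the wildcard. A rough Python sketch of that matching logic, using the rule list above (the evaluation order is my reading of the filter behavior, not HTTrack's actual code):

```python
import re

# the scan rules from step 2, as (sign, pattern), in order
rules = [
    ("-", "*"),
    ("+", "www.cert-la.com/calendar2/calendar.cgi"),
    ("+", "www.cert-la.com/calendar/calendar.cgi"),
    ("+", "www.cert-la.com/calendar/calendar.cgi?view=Event*"),
    ("+", "www.cert-la.com/calendar2/calendar.cgi?view=Event*"),
    ("-", "www.cert-la.com/calendar2/calendar.cgi?view=Search"),
    ("-", "www.cert-la.com/calendar/calendar.cgi?view=Search"),
    ("-", "www.cert-la.com/*/calendar.cgi?template=login.html"),
    ("-", "www.cert-la.com/calendar2/calendar.cgi?calendar=*"),
    ("-", "www.cert-la.com/calendar/calendar.cgi?calendar=*"),
    ("-", "www.cert-la.com/calendar2/calendar.cgi?year=*"),
    ("-", "www.cert-la.com/calendar/calendar.cgi?year=*"),
    ("-", "www.cert-la.com/calendar2/calendar.cgi?view=Month"),
    ("-", "www.cert-la.com/calendar/calendar.cgi?view=Month"),
    ("-", "www.cert-la.com/calendar2/calendar.cgi?style=*"),
    ("-", "www.cert-la.com/calendar/calendar.cgi?style=*"),
]

def glob_to_re(pattern):
    # treat "*" as "match anything"; every other character is literal
    return re.compile(re.escape(pattern).replace(r"\*", ".*") + r"\Z")

def allowed(url):
    verdict = False                       # nothing is taken unless a "+" rule matches
    for sign, pattern in rules:
        if glob_to_re(pattern).match(url):
            verdict = (sign == "+")       # last matching rule wins
    return verdict

print(allowed("www.cert-la.com/calendar/calendar.cgi?view=Event42"))  # event page: kept
print(allowed("www.cert-la.com/calendar/calendar.cgi?view=Search"))   # search view: skipped
```

This is why the leading "-*" matters: it rejects everything by default, and the "+" rules then punch holes in it for the pages we actually want.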
This should download all the calendar events.
(I'm excluding the non-event views because they link back to themselves;
in a link follower, that can lead to infinite loops.)
This will still download a LOT of pages
(I think about 10 pages more than the number of events).
(I am erring on the side of downloading less here.)
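For what it's worth, the same project can be set up with the command-line client instead of the WinHTTrack GUI. A sketch, assuming the usual flag mapping (-r1 for mirror depth 1, -%e0 for external depth 0, scan rules appended after the URLs); the output directory is my own example, and I've compressed the calendar/calendar2 pairs with a "calendar*" wildcard:

```shell
httrack "http://www.cert-la.com/calendar2/calendar.cgi" \
        "http://www.cert-la.com/calendar/calendar.cgi" \
        -O ./cert-la-mirror \
        -r1 -%e0 \
        "-*" \
        "+www.cert-la.com/calendar2/calendar.cgi" \
        "+www.cert-la.com/calendar/calendar.cgi" \
        "+www.cert-la.com/calendar*/calendar.cgi?view=Event*" \
        "-www.cert-la.com/calendar*/calendar.cgi?view=Search" \
        "-www.cert-la.com/*/calendar.cgi?template=login.html" \
        "-www.cert-la.com/calendar*/calendar.cgi?calendar=*" \
        "-www.cert-la.com/calendar*/calendar.cgi?year=*" \
        "-www.cert-la.com/calendar*/calendar.cgi?view=Month" \
        "-www.cert-la.com/calendar*/calendar.cgi?style=*"
```

Quoting the rules keeps the shell from expanding the "*" and "?" characters itself.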