HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Duplicate URLs when copying Mediawiki-based page
Author: WHRoeder
Date: 02/12/2013 15:31
 
1) Always post the ACTUAL command line used (or line two of the log file) so
we know what the site is, what ALL your settings are, etc.
2) Always post the URLs you're not getting and the URL from which they are
referenced.
:
> httrack --update --priority=7 --display=2
> --search-index --can-go-up-and-down --retries=10
> --do-not-generate-errors --keep-alive
> <http://DOMAIN/index.php/Main_Page>
> -"DOMAIN/*/Special:*"
> +"DOMAIN/*/Special:SpecialPages"
> +"DOMAIN/*/Special:AllPages"
> +"DOMAIN/*/Special:ListRedirects"
> +"DOMAIN/*/Special:Statistics"
> +"DOMAIN/*/Special:Version"
> +"DOMAIN/*/Special:WhatLinksHere*" -"*?*"

So what URLs are you getting that are duplicates?
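
For reference, HTTrack evaluates scan rules in order, and when several rules
match a URL the later one wins. That means the trailing -"*?*" above excludes
every URL containing a query string, including any Special: pages the earlier
+ rules allowed. Below is a minimal sketch of that ordering; DOMAIN is the
placeholder from the quote, and the idea that the duplicates are MediaWiki's
query-string variants (index.php?title=Page vs. index.php/Page) is my
assumption, not something confirmed by the original post:

# Sketch only: keep the path-style article URLs and drop the
# query-string variants of the same articles. Later rules take
# priority, so the - rule goes last.
httrack "http://DOMAIN/index.php/Main_Page" \
  +"DOMAIN/index.php/*" \
  -"DOMAIN/index.php?*"

If the mirror still ends up with two copies of the same article, the
request lines in hts-log.txt should show which rule let the second
form through.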
 