HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Downloading a Wiki?
Author: Ray
Date: 11/14/2009 19:46
 
Hmm. 

I can't find the robots.txt file, and the FAQ didn't seem to specify its
location, so I tried disabling robots.txt handling completely through the
options menu.
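
(For reference, I think the command-line equivalent of what I set in the GUI
would be something like the line below, assuming -s0 really is the "never
follow robots.txt" setting from the option list; the output folder is just a
placeholder of mine:

  httrack "http://avatar.wikia.com/" -O "C:\mirrors\avatar" -s0

where -O is the output directory and -s0 tells it to ignore robots.txt and
meta robots tags.)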

While that sort of worked (things started downloading), I'm receiving the
actual .php files instead of the HTML pages they generate. It does give me
some HTML files, but they end up named incorrectly because of how the wiki
serves its content.

For example, if the page is, say,
<http://avatar.wikia.com/wiki/Category:World_of_Avatar>, it redirects to an
index.php file with Category:World_of_Avatar as a parameter.

The result is a bunch of index.html files with numbers in their names, so you
can't actually find any of them. Is there some way to set it up so that
<http://avatar.wikia.com/wiki/Category:World_of_Avatar> would save as
Category:World_of_Avatar.html instead?
I heard mention of URL hacks in another thread. Would that work, and if so, is
there a guide on how to do it, and where are the files to modify?
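
For reference, this is the kind of command I was imagining, if I'm reading the
-N (user-defined structure) documentation right. The %[title] parameter-
extraction part is a guess on my part, assuming the wiki passes the page name
in a "title" parameter the way stock MediaWiki does:

  httrack "http://avatar.wikia.com/" -O "C:\mirrors\avatar" -s0 -N "%h%p/%[title].%t"

Here %h is the host, %p the path, %t the extension, and %[title] would
(hopefully) substitute the value of the title query parameter, so
index.php?title=Category:World_of_Avatar comes out as
Category:World_of_Avatar.html. No idea what it does for pages that have no
title parameter, though.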
Thanks,

~Ray
 