I have made several attempts to download a site, but it is not working at
all.
What I am trying to do is download a site, maintain its structure, and
download every file referenced by its pages. In other words, I want the
program to save the pages the way a browser would save them.
There are some pages on this site I would like to have:
<http://62.5.189.98/yamahaeur/exist/moto/>
For example, I want to save this page:
<http://62.5.189.98/yamahaeur/exist/moto/Motorcycle%201986-1992/DT125/2AJ%20%28ESP%29%201987%20%5B999%5D%20272AJ-352S1/>
so it is saved as:
c:\My Web Sites\62.5.189.98\yamahaeur\exist\moto\Motorcycle%201986-1992\DT125\2AJ%20%28ESP%29%201987%20%5B999%5D%20272AJ-352S1\
with the following pages saved as:
\cilindro
\ciguenal. pistin
\water pump
and each folder with the contents of that page.
I have managed to get this far, but instead of getting all of the files on
each page, I am only getting index.html in each folder.
When I save, say, /cilindro/ in Firefox, Firefox gives me:
c:\cilindro.htm
c:\cilindro [folder]\*all of the files for that page*
In short, how do I get HTTrack to save sites like:
www.site.com/page/
to
c:\site\
c:\site\page.htm
c:\site\page\javascript.js
c:\site\page\image.jpg
?
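For reference, here is a hedged sketch of how this might be done with HTTrack's command-line client (httrack.exe on Windows). The URL is the one from above; the output directory, filter, and depth are assumptions to adjust. `--near` asks HTTrack to fetch non-HTML files (images, scripts, CSS) linked near an HTML page even when filters would otherwise skip them, and `-rN` limits recursion depth:

```shell
REM Sketch only: mirror the /moto/ subtree into "C:\My Web Sites",
REM keeping the server's directory structure (HTTrack's default).
httrack "http://62.5.189.98/yamahaeur/exist/moto/" ^
  -O "C:\My Web Sites" ^
  "+62.5.189.98/yamahaeur/*" ^
  --near ^
  -r6
```

The `^` characters are cmd.exe line continuations; on one line, drop them. Whether this reproduces Firefox's `page.htm` + `page\` layout exactly is a separate question, since HTTrack's default structure mirrors the server's paths rather than Firefox's save convention.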
Regards