Yes, I know about the "wikidump" download, but first I would have to install XAMPP (for example) with MySQL on my PC, and second, the bigger problem is the page design in Wikimedia and the pictures, most of which are not included.
So I am looking for a way to start a "real" download of one or more wiki pages in parallel, but I was told that WinHTTrack has a problem with the site I gave in my first thread, and I don't understand why WinHTTrack doesn't start the download.
I have tried with and without robots.txt, different depths, all kinds of sites (if unknown...), and many other settings that can be changed in the program.
So, what is the main problem with this site? Thank you, thorsten