HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Website images not being downloaded
Author: trakker
Date: 08/26/2020 15:33
 
I noticed that I had run the command with -u0 before, just like in the other
topic where you helped me solve the problem by removing it. And indeed, that
was the issue here as well.

I ran it without -u0 and the files now get a proper .html suffix. The images
are also there (the original thread creator's initial issue).
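
For anyone else running into this: as far as I understand the -uN option, it
controls whether HTTrack checks the document type when it is unknown, and -u0
disables that check entirely (see httrack --help for the exact wording and
the default). A minimal illustration, with example.com as a placeholder:

  httrack https://example.com/ -u0
    (never check unknown document types; pages may end up without an
    .html suffix)
  httrack https://example.com/ -u2
    (always check the document type, so HTML pages get the proper suffix)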

I also noticed that the web server seems to have low specs: on some pages, my
captured copy shows a "hit resource limit" message in the browser.

So after removing the "speed improving" options, the new command line should
look like this and should be able to capture everything (it will take more
time, though; with my previous settings the mirror took about 3 hours and is
about 1 GB in size):

httrack https://system16.com/ -v -s0 -F "Mozilla/5.0 (Windows NT 10.0;
Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169
Safari/537.36" -%B -%s -%v -i
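
If the server still hits its resource limit, it should also be possible to
throttle the mirror explicitly rather than only dropping the speed options. A
rough sketch of what I mean (the values are only examples, not tested against
this server):

  httrack https://system16.com/ -v -s0 -c2 -%c1 -A25000

Here -c2 limits the number of simultaneous connections, -%c1 caps new
connections per second, and -A25000 caps the transfer rate at about 25 KB/s;
the user agent and the other options from the command above would stay the
same.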


@Matthew, hope that helps
@Nijaz, thanks again
 