HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Multithreaded page downloading
Author: Xavier Roche
Date: 09/24/2013 20:24
 
> Hi, I'm trying to download a large number of HTML
> pages (800K to be exact), and I have tried adding the
> list of 800k URLs to HTTrack; it begins to work,
> but it seems to go through the URLs and download

Hmm, you may try adding -%N0 (disable type checks) to the scan rules to speed
up the process. But httrack will also have trouble beyond 100k URLs; you may
also add -#L10000000 to bypass this limit. In any case, 800k URLs is a bit big
for a small-scale program like httrack :)
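For example, a minimal sketch of such a command line (urls.txt and ./mirror are placeholder names here; -%L loads a list of URLs from a text file, one per line, and -O sets the output path):

  # sketch: load the URL list, disable type checks, raise the link limit
  httrack -%L urls.txt -%N0 -#L10000000 -O ./mirror

Adjust the rest of the options (depth, filters, connection settings) to your own mirror as usual.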
 