HTTrack Website Copier
Free software offline browser - FORUM
Subject: Crash while loading a large URL list from a file
Author: Kevin
Date: 12/02/2008 03:48
 
I'm trying to run httrack on Windows XP from the command line, loading the URLs
to be downloaded from a file. Everything works as expected with a small test
list of 5 links, but when I increase the file to its full size (1112 URLs),
httrack crashes.

Here's my command line, as if it were executed from a batch file:

httrack.exe -%%L links.txt -O F:\download\project -r1 -%%e0 -%%c0.1 -c1 -R3
-%%P0 -N1 -K4 -j0 -s0 -%%F "" -%%l "en" -z -I0 -f2 -d -q

The log file says:
21:31:38	Info: 	engine: init
21:31:38	Info: 	1112 links added from F:\Dev\Bytewerx\Leads\scripts\links.txt
21:31:38	Debug: 	Cache: enabled=1, base=F:\download\project\hts-cache\, ro=0
21:31:38	Debug: 	Cache: size -1
21:31:38	Debug: 	Cache: no cache found in F:\download\project\hts-cache\

So, it looks like it's actually loading all of the URLs. What it does after
that, I don't know.

If I attempt to debug the crash, Visual Studio says: "Unhandled exception at
0x1006ccd0 in httrack.exe: Access violation reading location 0x00b0ffff."
Unfortunately, this is a release build, so I can't get anything but assembly
code. (I'd like to avoid building the project tomorrow, if I can. I'd have to
convert everything to VS2008. Ick!)

Has anyone ever seen this problem before? What can I try to fix it (besides
chopping up my input file into smaller chunks)?
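For reference, if I do end up falling back to chopping the file, this is roughly the splitting step I have in mind (a minimal sketch; the chunk size of 200 and the `links_NNN.txt` naming are arbitrary choices of mine, not anything HTTrack requires):

```python
import os

def split_url_list(path, chunk_size=200, out_dir="."):
    """Split a file of URLs (one per line) into numbered chunk files.

    Blank lines are dropped. Returns the list of chunk file paths,
    e.g. links_001.txt, links_002.txt, ...
    """
    with open(path) as f:
        urls = [line.strip() for line in f if line.strip()]
    chunk_paths = []
    for i in range(0, len(urls), chunk_size):
        out_path = os.path.join(out_dir, "links_%03d.txt" % (i // chunk_size + 1))
        with open(out_path, "w") as out:
            out.write("\n".join(urls[i:i + chunk_size]) + "\n")
        chunk_paths.append(out_path)
    return chunk_paths
```

Each chunk file could then be fed to httrack in a loop with the same options I'm using now, swapping the chunk path in for links.txt.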
Thanks,

Kevin
 