HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: Too Many Urls - Giving Up
Author: JP
Date: 09/13/2024 17:29
 
I've tried a few options, but I'm still getting the Panic error. I'm
using the command line via MacPorts because I'm on a Mac desktop.

I just want to capture a website and all of its images, but not traverse links
that go outside the domain. I still want those external links to be clickable.

The domain has multiple subdomains that I want captured as well.

What's the correct command line? I'm currently running it with a 20 million
link limit, but it is capturing far more than I want it to. It would be nice
if it logged all the links into a file as well.
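
From what I can tell, HTTrack already keeps a list of everything it fetched in
hts-cache/new.txt inside the project folder, so something like this ought to
give me the link log afterwards (untested, and all-links.txt is just a name I
picked):

grep -Eao 'https?://[^[:space:]]+' hts-cache/new.txt | sort -u > all-links.txt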

Here's the command line:
httrack https://(url) -O logfiles -Y -%e3 -n -q --advanced-maxlinks=20000000
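
I'm guessing something closer to this is what I actually want, but I'm not sure
(untested; example.com stands in for the real site, and logfiles is just my
project folder from above):

httrack "https://www.example.com/" "+*.example.com/*" -O logfiles -%e0 -n -q --advanced-maxlinks=1000000

The idea being: the +*.example.com/* filter lets it wander into the subdomains,
-%e0 keeps it from following anything off the domain (those links should stay
clickable, just pointing back to the live web), -n grabs the images and other
files sitting next to the HTML, and the smaller --advanced-maxlinks is only
there as a safety cap.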
 