HTTrack Website Copier
Free software offline browser - FORUM
Subject: Re: HTTrack Problem
Author: Tobias
Date: 03/09/2016 17:08
 
-_-' Solved... I just used Notepad++ to create 1998 lines, each line a number from 1
to 1998, and then added in front of every number, on every line:
https://lousysite.com/browse?page= ... which basically made 246 links. I copy-pasted
all of them into the main screen where you insert the URLs, then added the -* filter
and... +https://lousysite.com/hidden/alliance/mobsters/*.jpg -_-' Now ONLY
*.jpg files are downloaded (except for a few other small, insignificant mumbo-jumbo
files; I assume those come from the program or are a MUST for the download to work).
-_-' So simple -_-' I guess all of https://lousysite.com/browse?page=2..3...1998 ARE
HTML pages, and that's why they were denied in my earlier attempts once I added -*
(stupid me didn't know that -_-'). Hope others will find this useful, and I hope I
don't overload the server or trigger anything else bad, BUT it is working and I got
what I wanted! Cheers!
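
For anyone who would rather script the URL list instead of building it by hand in
Notepad++, here is a minimal sketch in Python (the domain, the 1..1998 page range and
the filter are just the values from this post, so adjust them for your own site). It
writes one browse?page=N URL per line into urls.txt, which you can then paste into
HTTrack's URL field, keeping the same two scan rules:

    # generate_urls.py - rough sketch, not an official HTTrack tool
    # Writes one "browse?page=N" URL per line; paste the contents of urls.txt
    # into HTTrack's URL field and keep the two scan rules noted below.

    BASE = "https://lousysite.com/browse?page="   # prefix used in this post
    LAST_PAGE = 1998                              # pages 1..1998 as described above

    with open("urls.txt", "w") as f:
        for n in range(1, LAST_PAGE + 1):
            f.write(BASE + str(n) + "\n")

    # Scan rules (filters) to enter in HTTrack, exactly as in the post:
    #   -*
    #   +https://lousysite.com/hidden/alliance/mobsters/*.jpg

    print("wrote urls.txt with", LAST_PAGE, "links")

Running it once (python generate_urls.py) produces urls.txt in the current folder.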
 