Ok I searched the database here and maybe I found an
answer, but let me know if I'm wrong...
Look at the post called "Quickly extract only valid urls
from hts-log.txt ?" from 09/02/2002 (or 02/09?)
That gave me an idea... Is there a way to just scan
(keyword: SCAN) 1 deep and then go to an hts logfile that
shows all the links on that page?
If this works then I could, for example, scan the eBay
auction results page, go to the list of links, and delete
all the ones I don't want. Then make this my URL list and
go 1 deep. Will this work? I'm gonna try it.
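In case it helps anyone trying the same workflow, here's a rough sketch of the middle step (pulling the links out of the logfile so they can be hand-edited into a URL list). This is just an illustration assuming the log contains plain http:// URLs in its message lines, not anything official from HTTrack:

```python
import re

# Pull every http(s) URL out of hts-log.txt-style text so the list
# can be hand-edited and fed back in as a URL list for a 1-deep scan.
URL_RE = re.compile(r"https?://[^\s\"'<>]+")

def extract_urls(log_text):
    # Deduplicate while keeping the order links first appeared in.
    seen = []
    for url in URL_RE.findall(log_text):
        if url not in seen:
            seen.append(url)
    return seen

sample = ("link added: http://example.com/a and http://example.com/b\n"
          "link added again: http://example.com/a\n")
print("\n".join(extract_urls(sample)))
```

The output could then be saved to a text file, unwanted lines deleted by hand, and the remainder used as the URL list for the next scan.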
Anyways, I thought about the web address capture last
night. I think the ultimate would be to open up a website
in a special browser and then click and drag a box around
a list of links and then be able to output JUST THOSE
LINKS into a text file. This would enable you to go to a
site with a list of links all in a row that you would want
and skip the links at the bottom, top, and side of a page.
This is asking for something really nice, but if what I
described above doesn't work, then just being able to get
ALL links from a page would be nice. I can go back and
edit out the links I don't want.
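For what it's worth, the fallback ("get ALL links from a page, then edit by hand") is easy to script yourself. Here's a minimal sketch using Python's standard HTML parser; the page content and output filename are made up for the example:

```python
from html.parser import HTMLParser

# Collect every href on a page, ready to dump to a text file
# so unwanted links can be edited out by hand.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<a href="http://example.com/x">x</a><p><a href="/y">y</a></p>'
collector = LinkCollector()
collector.feed(html)
print("\n".join(collector.links))
```

The printed list can be redirected to a file, trimmed in any text editor, and used as the URL list for the next run.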
Joe,
P.S. Great tool... Now that I've used your forum, I am
inclined to send some money soon... do you take paypal? If
not, let me know and I will refer you.