Hello. I work in a very small, underfunded department at the University of
Missouri that is currently doing a study on certain kinds of "citizen
journalism" websites. To do the project properly, we need to capture just the
first page or two of approximately 450 sites daily for a week.
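In case it helps to see what I have in mind, here is my rough plan for each daily run (sites.txt, the output path, and the exact flags are just my guesses at a setup, not a tested recipe):

```shell
# sites.txt holds one root URL per line (~450 lines).
# -%L reads the start URLs from a list file, -r2 limits the mirror
# depth to two levels, and -O sets the output directory, which I'd
# date-stamp so each daily run lands in its own folder.
httrack -%L sites.txt -r2 -O "./captures/$(date +%F)"
```

I'd then schedule that from cron to run once a day for the week.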
I'm what passes for our IT type here. :) I've used HTTrack before at home for
grabbing websites, but never so many at once.
Is there any reason to believe HTTrack won't be able to handle that many
different websites in one project, given that only one or two pages per
website will be captured?
That is, I know you can set a maximum number of total links, but is there a
maximum number of ROOT WEBSITES?
I have searched, read the FAQ, etc., but didn't see anything related to this.
Thanks for your help!