I am trying to scrape a UBB Threads site, which serves PHP
generated pages. The program downloads every file it
parses into a single directory. The site is HUGE, with several
thousand posts/threads, so putting all of the files in one
directory makes it difficult to navigate and expand/view
in Explorer. Is there any way to force a new directory to be
created once a certain size is reached, while keeping all of
the links intact?
Thanks,
Rob
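
One way to do this after the fact, rather than during the download, is a small post-processing script. The sketch below is not tied to any particular scraper; it assumes the pages ended up as plain files in one flat directory and that links between them are simple relative `href`/`src` attributes. It moves the files into numbered subdirectories of a fixed size (the `files_per_dir` cutoff and `sub000`-style names are my own choices, not anything UBB-specific) and then rewrites the links so they still resolve:

```python
import os
import re
import shutil

def shard_directory(root, files_per_dir=500):
    """Move the files in a flat directory into numbered subdirectories
    (sub000, sub001, ...) and return a mapping old name -> new relative path."""
    mapping = {}
    # Snapshot the file list before moving anything.
    files = sorted(f for f in os.listdir(root)
                   if os.path.isfile(os.path.join(root, f)))
    for i, name in enumerate(files):
        sub = "sub%03d" % (i // files_per_dir)
        os.makedirs(os.path.join(root, sub), exist_ok=True)
        shutil.move(os.path.join(root, name), os.path.join(root, sub, name))
        mapping[name] = sub + "/" + name
    return mapping

def rewrite_links(root, mapping):
    """Rewrite href/src attributes in the moved pages so links to other
    moved files point at their new locations."""
    pattern = re.compile(r'(href|src)=(["\'])([^"\']+)\2', re.IGNORECASE)
    for new_rel in mapping.values():
        path = os.path.join(root, new_rel)
        if not path.lower().endswith((".html", ".htm", ".php")):
            continue
        with open(path, encoding="utf-8", errors="replace") as f:
            text = f.read()

        def fix(m):
            target = m.group(3)
            if target in mapping:
                # The link pointed at a flat sibling; both files are now
                # one level down, so go up and into the target's subdir.
                return '%s=%s../%s%s' % (m.group(1), m.group(2),
                                         mapping[target], m.group(2))
            return m.group(0)

        with open(path, "w", encoding="utf-8") as f:
            f.write(pattern.sub(fix, text))
```

Run `rewrite_links(root, shard_directory(root))` once over the finished download. Links with query strings or absolute URLs are left untouched by this sketch, so pages the scraper saved under rewritten names may need extra handling.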