curl-users
Building recursive list of URLs, no page downloading
Date: Wed, 29 Jan 2003 22:52:16 +0000 (GMT)
I need to spider thousands of URLs on our company's websites to see how
many URLs there are ahead of the move to our new CMS.
Access to some sections of the websites is user/pass restricted, and
authentication is performed through cookies, not standard HTTP auth, so it
is essential that I can load cookies into this program.
Also, I do not need to actually download each page, just note its URL and
move on to the next URL linked from the first page.
Not sure whether there is a way in curl, or perhaps wget?
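For concreteness, here is one way the "collect link URLs without downloading the targets" step could be sketched, independent of curl or wget. This is a minimal sketch in Python, not a full spider: the page content, base URL, and cookie string below are placeholders, and the `authed_request` helper only illustrates how a session cookie lifted from a browser might be attached to a request for the restricted sections.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request


class LinkCollector(HTMLParser):
    """Collect absolute URLs from <a href=...> tags in a page,
    without fetching any of the linked targets."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's base URL.
                    self.links.append(urljoin(self.base_url, value))


def authed_request(url, cookie):
    # Hypothetical helper: attach the session cookie so restricted
    # sections answer. The cookie value would come from a logged-in
    # browser session; it is not fetched here.
    return Request(url, headers={"Cookie": cookie})


# Placeholder page content and base URL for illustration only.
page = '<a href="/about">About</a> <a href="http://other.example/x">X</a>'
collector = LinkCollector("http://www.example.com/")
collector.feed(page)
print(collector.links)
# -> ['http://www.example.com/about', 'http://other.example/x']
```

A real run would feed each fetched page into the collector, queue any new same-site links, and record the URLs without saving the bodies. (wget also has `--spider` and `--load-cookies` options that may cover part of this.)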
Thanks for any suggestions,
David
Received on 2003-01-29