curl-users
Re: ANNOUNCE: curl 7.8
Date: Thu, 7 Jun 2001 20:24:18 +0200 (MET DST)
On Thu, 7 Jun 2001, Pierre Z. wrote:
> I've downloaded curl 7.8 Windows binary (Win32-SSL) this morning and it
> works fine on my NT laptop - AND: it has resolved the Relocation issue I
> mentioned to you earlier (the one that caused "infinite loops" when
> trying to fetch secure documents).
Whoa! :-) One down, a thousand more to go...
> One question: because accessing specific data on that secure server
> requires going thru a series of pages, and to reduce the number of
> connections (I'm not sure about persistency) I'm wondering if there's any
> way to go thru those pages in one single cURL command line (therefore
> passing the headers/cookies along each new page accessed...). I'm using
> files to receive/transmit cookies (not "Name=Value" syntax)
Cookies will be parsed, understood and kept in memory, and used correctly
for all subsequent URLs specified on the same command line, provided you use
-b to activate the cookie "engine".
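As a rough sketch (the URLs and file names here are made up, and I haven't
verified the exact behaviour against 7.8), something along these lines should
keep the cookies flowing through the whole page sequence:

  curl -b cookies.txt -D headers.txt \
       https://secure.example.com/login -o page1.html \
       https://secure.example.com/data  -o page2.html

(The backslashes are unix-shell line continuation, put it all on one line on
NT.) The cookies read from cookies.txt, plus any new ones the server sets
along the way, are held in memory and sent on each following request, as long
as -b is there to switch the engine on.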
> I know one can pile up multiple URL's in a single line (and save to
> multiple files), however can you use multiple "-b" (send headers to
> server) and "-D" (save headers from Server) combinations on one command
> line to make sure the cookies keep updated and transmitted from and to
> the Server while cURL visits each page in the sequence? How would you do
> this?
If you're concerned about cookies only, they should just work!
Otherwise, there isn't any really good way. If you want to do magic things,
you need to use the magic tools: go with a scripting language and its libcurl
interface, like perl or python or something.
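For instance, a pycurl-flavoured sketch could look roughly like this (the
URLs, file names and page count are made up, it's just to show the idea of
reusing one handle so cookies and the connection stick around):

  import pycurl

  # hypothetical page sequence on the secure server
  urls = ["https://secure.example.com/login",
          "https://secure.example.com/step2",
          "https://secure.example.com/data"]

  c = pycurl.Curl()
  # reading a cookie file switches the cookie engine on, like -b does;
  # cookies picked up along the way stay in memory for later requests
  c.setopt(pycurl.COOKIEFILE, "cookies.txt")

  num = 0
  for url in urls:
      out = open("page%d.html" % num, "wb")
      c.setopt(pycurl.URL, url)
      c.setopt(pycurl.WRITEFUNCTION, out.write)
      c.perform()   # same handle: connection and cookies are reused
      out.close()
      num = num + 1

  c.close()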
-- Daniel Stenberg -- curl dude -- http://curl.haxx.se/