curl-library
Re: cookie jar thoughts
Date: Wed, 29 Aug 2001 07:18:51 -0400
> 1. "back then" there was a suggestion flying around that we could support
> some kind of callback system to allow applications *not* to store the
> cookies in a file, but instead keep them in memory for faster accesses
> when doing many operations. Anyone with a bright idea on how that
> would work?
I think I have arrived quite late at the scene, but let me put my
thoughts down.
Possibly we can have a hash table into which we load the cookie file on
start-up, maybe in the global init function, and write it back into the
file on global_cleanup. All operations at run time would be done on this
table. The table would stay active through the lifetime of the
application (at least until we invoke global_cleanup), irrespective of
the number of curl sessions created, with all the sessions sharing it.
Something along the lines of the sketch below.
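Just to make the idea concrete, here is a very rough sketch. Every name
in it is made up for illustration; nothing like this exists in libcurl
today, and the real layout would of course look different:

  #include <string.h>

  /* one in-memory cookie table, shared by every curl session in the
     application; filled from the cookie file at global init and
     written back at global cleanup */

  struct cookie {
    char *domain;
    char *path;
    char *name;
    char *value;
    long  expires;         /* unix time, 0 = session-only cookie */
    struct cookie *next;   /* chain within one hash bucket */
  };

  #define COOKIE_BUCKETS 64

  static struct cookie *cookie_table[COOKIE_BUCKETS];

  static unsigned int hash_domain(const char *domain)
  {
    unsigned int h = 5381;
    while(*domain)
      h = h * 33 + (unsigned char)*domain++;
    return h % COOKIE_BUCKETS;
  }

  /* called from the global init function: parse the Netscape-format
     file and fill the table (parsing left out of this sketch) */
  int cookies_load(const char *filename);

  /* called from global_cleanup: write the table back in the same format */
  int cookies_flush(const char *filename);

  /* what a session would call at request time; exact-match only here,
     real matching would tail-match the domain and prefix-match the path */
  struct cookie *cookies_match(const char *domain, const char *path)
  {
    struct cookie *c = cookie_table[hash_domain(domain)];
    for(; c; c = c->next)
      if(!strcmp(c->domain, domain) &&
         !strncmp(c->path, path, strlen(c->path)))
        return c;   /* first hit only; real code would collect them all */
    return NULL;
  }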
But again this needs a locking mechanism, which I feel (as many have
already pointed out) should be left to the caller. Maybe we can provide
some default locking functions on some platforms and leave the rest to
the developer on that platform; one possible shape is sketched below.
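For example (again purely hypothetical, and the pthread part is only
meant as a possible default on platforms that have it):

  #include <pthread.h>

  /* the application registers two callbacks and libcurl calls them
     around every access to the shared cookie table */
  typedef void (*cookie_lock_cb)(void *userdata);
  typedef void (*cookie_unlock_cb)(void *userdata);

  struct cookie_locks {
    cookie_lock_cb   lock;
    cookie_unlock_cb unlock;
    void            *userdata;   /* e.g. a pointer to the app's mutex */
  };

  /* a possible default for pthread platforms */
  static pthread_mutex_t default_mutex = PTHREAD_MUTEX_INITIALIZER;

  static void default_lock(void *userdata)
  {
    pthread_mutex_lock((pthread_mutex_t *)userdata);
  }

  static void default_unlock(void *userdata)
  {
    pthread_mutex_unlock((pthread_mutex_t *)userdata);
  }

  static struct cookie_locks default_locks = {
    default_lock, default_unlock, &default_mutex
  };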
Regards
Bhararth
Daniel Stenberg wrote:
I've started fiddling with the cookie issue brought up on the curl mailing
list back in June [1] that I've tried to summarize before [2].
A central point in this new functionality would be libcurl's added
ability to *write* cookie files, using the Netscape/Mozilla format [3],
as this makes it possible to use (lib)curl in a whole range of new uses
that currently would involve an additional layer of cookie parsing.
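(For reference, a line in such a Netscape-format cookie file is a set of
tab-separated fields, roughly: domain, a TRUE/FALSE flag for whether
subdomains match, path, a secure flag, expiry as unix time, name and
value, along these lines:

  .example.com	TRUE	/	FALSE	1062000000	sessionid	abc123

see [3] for the details.)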
But as we're talking about a library here (command line tool talk will
come later in the other list), we need to straighten out some tiny
quirks first! ;-)
1. "back then" there was a suggestion flying around that we could
support
some kind of callback system to allow applications *not* to
store the
cookies in a file, but instead keep them in memory for faster
accesses
when doing many operations. Anyone with a bright idea on how
that would
work?
2. This system would make libcurl work with an entire collection of
   cookies, and I figure many people could end up using their actual
   set of cookies in ~/.netscape/cookies (and similar). Then, how is
   the libcurl programmer gonna be able to control which cookies are
   actually used in the requests (or possibly the reverse, which are
   not gonna be used), and which cookies get saved again in the file?
   We'd need some kind of filtering ability, don't we?
Both these issues can be left to solve later, as they don't
necessarily change the basic functionality.
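One way the filtering asked about in (2) could perhaps work: let the
application install a callback that gets asked before a cookie is sent
in a request and before it is saved back to the file. Just a
hypothetical sketch with invented names, not anything that exists in
libcurl:

  #include <string.h>

  /* libcurl would call the filter once per cookie; a zero return
     means "skip this cookie" in that context */
  enum cookie_context {
    COOKIE_CONTEXT_SEND,   /* about to include the cookie in a request */
    COOKIE_CONTEXT_SAVE    /* about to write the cookie to the jar file */
  };

  typedef int (*cookie_filter_cb)(const char *domain,
                                  const char *name,
                                  const char *value,
                                  enum cookie_context context,
                                  void *userdata);

  /* example filter: only ever send or save cookies for one site */
  static int only_my_site(const char *domain, const char *name,
                          const char *value, enum cookie_context context,
                          void *userdata)
  {
    (void)name; (void)value; (void)context; (void)userdata;
    return strstr(domain, "example.com") != NULL;
  }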
[1] = http://curl.haxx.se/mail/archive-2001-06/0092.html
[2] = http://curl.haxx.se/dev/cookie-jar.txt
[3] = http://www.netscape.com/newsref/std/cookie_spec.html
--
Daniel Stenberg -- curl groks URLs -- http://curl.haxx.se/
Received on 2001-08-29