curl-users
memory leak goes, new perl interface comes
Date: Tue, 10 Oct 2000 18:05:57 +0200
Hi,
as the small memory leak in libcurl has been fixed (thanks for the quick
help!), I have made a new version of my Perl interface to libcurl. It's
available from http://koblenz-net.de/~horn/export/Curl.tgz
Registering callback functions and passing lists for the HTTPHEADER and
HTTPPOST options still does not work, but the ERRORBUFFER, FILE, INFILE and
WRITEHEADER options now work. So simple downloads (which is what I use curl
for...) can now be done.
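For reference, in libcurl's C API those two options look roughly like this
(just a sketch; the URL and output filename are placeholders, and the Perl
calling convention of my module differs):

  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
      CURL *handle = curl_easy_init();
      char errbuf[CURL_ERROR_SIZE];
      FILE *out = fopen("index.html", "wb");
      CURLcode res;

      if(!handle || !out)
          return 1;

      curl_easy_setopt(handle, CURLOPT_URL, "http://example.com/");
      curl_easy_setopt(handle, CURLOPT_ERRORBUFFER, errbuf);  /* ERRORBUFFER */
      curl_easy_setopt(handle, CURLOPT_FILE, out);            /* FILE */

      res = curl_easy_perform(handle);
      if(res != CURLE_OK)
          fprintf(stderr, "download failed: %s\n", errbuf);

      fclose(out);
      curl_easy_cleanup(handle);
      return (int)res;
  }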
I also added a function curl_easy_getinfo() that, for now, always returns
the number of bytes that have been downloaded (which is, for now, the
only thing I need to know after a download...). Later, libcurl itself shall
provide this function. (I implemented it by internally installing a write
callback that counts the bytes.)
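In C terms the trick is roughly this (names are made up, just to show the
idea):

  #include <stdio.h>
  #include <curl/curl.h>

  /* hypothetical helper struct, for illustration only */
  struct counting_sink {
      FILE *out;      /* where the data really goes (the FILE target) */
      size_t total;   /* bytes seen so far */
  };

  static size_t counting_write(void *ptr, size_t size, size_t nmemb,
                               void *stream)
  {
      struct counting_sink *sink = (struct counting_sink *)stream;
      size_t n = size * nmemb;
      sink->total += n;                     /* count every delivered byte */
      return fwrite(ptr, 1, n, sink->out);  /* then pass it on as usual */
  }

The callback is registered with CURLOPT_WRITEFUNCTION and the struct is
passed as the write target, so after curl_easy_perform() the total field
holds the number of downloaded bytes.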
While playing around with this stuff, I thought of the following:
(No criticism, just my thoughts...)
It would be nice to have a function curl_easy_open() that returns a file
descriptor or a FILE * pointer that one can read from for downloading or
write to for uploading. With the current libcurl you can only write to stdout
or to a file, but in a program that uses the lib I usually don't want to
imitate the curl program: I want to read from a descriptor and immediately
process the data as it arrives, or generate data from scratch and write it
directly to some file descriptor to upload it somewhere. Currently I have to
write to a file, then reopen that file and process the data. Or I have to do
"complicated" fiddling with callback functions...
Or can this already be done with the low-level libcurl functions?
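To show what I mean by the callback fiddling: something like this works, but
it feels more roundabout than just reading from a descriptor would
(process_chunk() is made up and stands for whatever the application wants to
do with the data):

  #include <stdio.h>
  #include <curl/curl.h>

  /* hypothetical processing step */
  static void process_chunk(const char *data, size_t len)
  {
      fwrite(data, 1, len, stdout);
  }

  /* write callback: hand every chunk straight to the application */
  static size_t handle_data(void *ptr, size_t size, size_t nmemb, void *stream)
  {
      (void)stream;
      process_chunk((const char *)ptr, size * nmemb);
      return size * nmemb;
  }

  int main(void)
  {
      CURL *handle = curl_easy_init();
      if(!handle)
          return 1;
      curl_easy_setopt(handle, CURLOPT_URL, "http://example.com/");
      curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, handle_data);
      curl_easy_perform(handle);
      curl_easy_cleanup(handle);
      return 0;
  }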
I also don't like the way error messages are stored in a buffer that must be
supplied by the calling program. I would prefer that the curl_easy_*()
functions either return an error code or set some global variable, and that
the error message could then be retrieved via a function call. Compare the
errno concept in C.
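Side by side, that is the difference between the current scheme and what I
have in mind (the errno-style call in the comment is hypothetical, it does
not exist in libcurl):

  #include <stdio.h>
  #include <curl/curl.h>

  static CURLcode fetch(CURL *handle, const char *url)
  {
      char errbuf[CURL_ERROR_SIZE];   /* caller has to supply the storage */
      CURLcode res;

      curl_easy_setopt(handle, CURLOPT_URL, url);
      curl_easy_setopt(handle, CURLOPT_ERRORBUFFER, errbuf);

      res = curl_easy_perform(handle);
      if(res != CURLE_OK)
          fprintf(stderr, "curl error %d: %s\n", (int)res, errbuf);

      /* what I would prefer (hypothetical, not part of libcurl):
       *
       *   if(curl_easy_perform(handle) != CURLE_OK)
       *       fprintf(stderr, "curl error: %s\n", curl_easy_error());
       */
      return res;
  }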
Bye, and thanks for listening ;-)
Georg
Received on 2000-10-10