curl-library
"from memory" chunked upload
Date: Wed, 7 Dec 2005 17:12:40 +0100
Hi, I'm currently trying to use libcurl to upload data from a buffer in
memory, in "chunks".
I've noticed the READFUNCTION and READDATA options, but my problem is the
following:
The buffer I wish to upload from is fed chunk by chunk, so I may not have
all the data in it when I start the upload. I would like to upload what I
have in the buffer, then "pause" the upload to insert new data, then resume
the upload. Is there any way to get this behavior (single-threaded)?
It would go something like this:
--> feed my source buffer with some data
--> call libcurl's perform(), which will call the READFUNCTION callback as
long as I have data in my source buffer, then pause (but not end the
connection).
--> feed my source buffer with the rest of the data
--> call the libcurl perform() again to upload the new data.
--> tell libcurl that the transfer is completed.
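For reference, the read-callback side of the flow above might be sketched as follows. This is only an illustration, not a working pause mechanism: the `upload_src` struct and `read_cb` names are hypothetical, and note that returning 0 from the callback tells libcurl the upload is *finished*, which is exactly the behavior being asked about here.

```c
#include <string.h>

/* Hypothetical in-memory source; names are illustrative only. */
struct upload_src {
    const char *data;  /* the chunk currently available */
    size_t len;        /* total bytes in this chunk */
    size_t pos;        /* how far libcurl has read */
};

/* Matches the CURLOPT_READFUNCTION signature: copy up to
   size * nmemb bytes into ptr and return the number copied.
   Returning 0 signals end of upload, so simply running out of
   buffered data would end the transfer rather than pause it. */
static size_t read_cb(char *ptr, size_t size, size_t nmemb, void *userdata)
{
    struct upload_src *src = userdata;
    size_t room = size * nmemb;
    size_t left = src->len - src->pos;
    size_t n = left < room ? left : room;

    memcpy(ptr, src->data + src->pos, n);
    src->pos += n;
    return n;
}

/* Registration would look like this, assuming an easy handle 'curl'
   and an upload_src 'src' (sketch, not compiled here):

   curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
   curl_easy_setopt(curl, CURLOPT_READDATA, &src);
*/
```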
I've read that "you can call curl_easy_perform() as many times as you want".
That's exactly what I would like to do, except I don't want the transfer to
end between each call, only when I say so.
Thanks,
Benjamin Garrigues
Received on 2005-12-07