curl-library
batched write without control over future data availability
Date: Tue, 11 Oct 2011 14:09:52 -0700 (PDT)
Greetings.
(Summary: how to use libcurl when we do not have the complete data upfront, and the read callback cannot block waiting for future data.)
I am working on an Adapter pattern: one side exposes simple file read/write semantics without seek; the other side is libcurl-based.
In other words, writing a new 'file' starts with a simple create() call. After that, write() does the actual data transfer: the first call writes at index 0, and subsequent calls are assumed to continue at sequential indexes,
until a close() is issued. Please note there is no file pointer to read data from; the data arrives in a buffer of some uncertain size. (A rough sketch of this interface follows after the steps below.)
Steps:
create('file_name').
write(index 0, size 100) ==> Adapter ==> Curl (Put) operation.
write(index 100, size 200) ==> Adapter ==> Curl (Put) operation.
write(index 300, size 10) ==> Adapter ==> Curl (Put) operation.
close();
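To make the shape concrete, here is a minimal sketch of the adapter interface I have in mind (the adapter_* names are hypothetical, purely for illustration):

    #include <stddef.h>

    /* Hypothetical adapter interface -- names are for illustration only. */
    typedef struct adapter_handle adapter_handle;

    adapter_handle *adapter_create(const char *file_name);  /* create('file_name')     */
    int adapter_write(adapter_handle *h, size_t index,
                      const void *buf, size_t size);        /* sequential-index writes */
    int adapter_close(adapter_handle *h);                   /* end of the 'file'       */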
So how can subsequent curl_easy_perform() calls work on the same stream? When the read callback reaches
the end of the 100 bytes from the first write() above, it has to return 0, and that implies end of stream to curl.
Should I reset the headers to present some kind of 'continue' response?
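For reference, the read callback I am describing looks roughly like this; struct upload_state is a hypothetical holder for the buffer from one write() call, and the comment marks where the end-of-stream signal happens:

    #include <string.h>
    #include <curl/curl.h>

    /* Hypothetical per-transfer state: the buffer from the current write(). */
    struct upload_state {
        const char *buf;   /* data handed in by write()      */
        size_t      len;   /* bytes remaining in that buffer */
    };

    static size_t read_cb(char *dest, size_t size, size_t nmemb, void *userp)
    {
        struct upload_state *st = (struct upload_state *)userp;
        size_t room = size * nmemb;
        size_t n = (st->len < room) ? st->len : room;

        if(n == 0)
            return 0;   /* <-- libcurl takes this as end of the upload stream */

        memcpy(dest, st->buf, n);
        st->buf += n;
        st->len -= n;
        return n;
    }

    /* Installed with:
     *   curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
     *   curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
     *   curl_easy_setopt(curl, CURLOPT_READDATA, &state);
     */

I did notice CURL_READFUNC_PAUSE in the documentation (returning it from the callback pauses the transfer until curl_easy_pause(handle, CURLPAUSE_CONT) is called), but I am not sure whether that is the intended mechanism for this scenario.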
Thanks,
-CS
PS: Googling for this scenario was difficult, as I did not know exactly what to ask for.
I did go through some examples; apologies if this is a repeat question.