Re: CURLOPT_WRITEFUNCTION function getting called multiple times even when the complete http response body is only 2155 bytes
Date: Fri, 19 Jun 2015 18:52:34 -0400
On 6/19/2015 7:04 AM, Ganesh Nikam wrote:
> The problem I am facing with curl is that, my write_callback is
> getting called multiple times for single http response. As per the
> documentation this callback can be called multiple times in some
> cases. My query is: in what scenarios will it be called multiple
> times, and how does libcurl decide? Here is the text from the documentation:
> "The callback function will be passed as much data as possible in all
> invokes, but you must not make any assumptions. It may be one byte, it
> may be thousands. The maximum amount of body data that will be passed
> to the write callback is defined in the curl.h header file:
> CURL_MAX_WRITE_SIZE (the usual default is 16K)".
> Some queries on this point:
> 1. What do you mean by "as much as possible" here? How does libcurl
> decide?
> 2. In my case, my complete http response body is 2155 bytes. Even for
> such a small response libcurl breaks the message into two callbacks,
> first with 1295 bytes and the other with 860 bytes. What could be the
> reasons for calling the callback twice for such a small message? My
> server response is not going to be more than 5K.
> 3. I want to get the complete http response in a single write_callback
> invocation. But as per the thread below it is not possible:
> I have implemented a wrapper class over the libcurl API. My
> application's requirement is that this wrapper class return the
> complete http response to it. Now I would like to collect the multiple
> chunks (received from multiple write_callback calls) together. For
> that I want to know the complete size of the http response. Is there
> any way to get the complete size of the http response? I checked the
> easy_get_opt flags, but I didn't find any relevant flag. Can you
> please suggest a way to achieve this?
> My http server sends the response in chunked encoding. Is there a
> way to get the chunked response size?
> libcurl version is 7.40 and the platform is Ubuntu 12.04
libcurl passes a received chunk of (parsed) data up to
CURL_MAX_WRITE_SIZE to your write callback. As noted in the
documentation, do not make any assumptions. "As much as possible" does
not necessarily mean all of the received raw data, even when that data
is less than the max, because of parsing. This is true in your case and
in others (an IMAP body, I think, may be passed to the write callback
line by line).
For example you make a request and get a reply like this that all
arrives at once:
HTTP/1.1 200 OK
Transfer-Encoding: chunked

3
FOO
3
BAR
0
libcurl in readwrite_data will call Curl_read to read the raw data,
and it gets that whole thing. First it needs to handle the headers, so
it goes into parse-the-header mode, and after that it can determine how
to handle the body. In this case the encoding is chunked (as determined
by the headers), and it will start to parse the body at the first '3',
parsing the chunks in Curl_httpchunk_read, which also writes the data
away. So:
The first time the write callback is called with data "FOO", len 3.
The second time the write callback is called with data "BAR", len 3.
You do not need to know the complete body size to collect the multiple
chunks together. As you receive the response, write it somewhere, e.g.
to memory. As you can see in that memory example, if you need more
memory just reallocate. Depending on your performance requirements you
may have to make some adjustments.
As to your question on content length: chunked encoding is used when the
server does not know the length of the content it will return at the
time it starts the reply, and even when it does know, the Content-Length
header isn't allowed alongside chunked encoding. In other words you
cannot get the complete size of the body until the transfer is complete.
The only way that may be possible is in the most unlikely of
configurations, where your server sends a custom header carrying the
expected length (but not Content-Length itself), but I don't know why
anyone would do that (unless it was sending a really large file or
something? still seems weird though).
List admin: http://cool.haxx.se/list/listinfo/curl-library
Received on 2015-06-20