curl-library
Re: Size of chunks in chunked uploads
Date: Fri, 1 May 2009 20:54:56 +0200 (CEST)
On Fri, 1 May 2009, Apurva Mehta wrote:
> (0) Doesn't the read callback accept as arguments the maximum size it is
> allowed to copy into the buffer? How is it then possible to have the read
> callback send larger or smaller values (and so control the chunk size)?
You can respond with less data than what it asks for.
So if your callback today returns some amount of data that is less than what
libcurl asks for, you can simply make it return a larger or smaller amount to
control the chunk size.
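
A minimal sketch of such a callback (the 1024-byte CHUNK_MAX cap and the
FILE * user pointer are assumptions for illustration, not anything libcurl
mandates); the number of bytes the callback returns becomes the size of the
next chunk:

  #include <stdio.h>

  #define CHUNK_MAX 1024 /* assumed cap; pick whatever chunk size you want */

  size_t read_callback(void *ptr, size_t size, size_t nmemb, void *stream)
  {
    size_t room = size * nmemb;  /* the most libcurl accepts this call */
    if(room > CHUNK_MAX)
      room = CHUNK_MAX;          /* hand back less to get smaller chunks */
    return fread(ptr, 1, room, (FILE *)stream); /* bytes actually copied */
  }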
> (1) What about the command-line curl utility? I notice that when I use it to
> upload large files using chunked encoding, the server receives 128-byte
> chunks. For the same file uploaded to the same server without chunked
> encoding, the server receives the data in 4000-byte segments. (This is an
> Apache web server and I get these numbers because I have a custom Apache
> module handling these uploads.) This is what led me to believe that there
> is some implicit default value for the chunk size.
I have no explanation for that. 'curl' uses the exact same read function for
reading from a file whether the upload is chunked or not, so it should provide
data in the exact same way/pattern in both cases.
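
For reference, here is a minimal sketch of a chunked upload with libcurl (the
URL and file name are placeholders); adding the Transfer-Encoding header is
the documented way to ask for chunked uploading, and the upload size is
deliberately left unset:

  #include <stdio.h>
  #include <curl/curl.h>

  size_t read_callback(void *ptr, size_t size, size_t nmemb, void *stream);
  /* the chunk-capping callback sketched above */

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      FILE *src = fopen("bigfile.bin", "rb"); /* placeholder file name */
      struct curl_slist *hdrs =
        curl_slist_append(NULL, "Transfer-Encoding: chunked");

      curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
      curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
      curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
      curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_callback);
      curl_easy_setopt(curl, CURLOPT_READDATA, src);
      /* no CURLOPT_INFILESIZE set; the data gets sent chunked */

      curl_easy_perform(curl);

      curl_slist_free_all(hdrs);
      curl_easy_cleanup(curl);
      fclose(src);
    }
    return 0;
  }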
--

 / daniel.haxx.se

Received on 2009-05-01