Re: HTTP PUT and chunked transfer

From: Daniel Stenberg <daniel-curl_at_haxx.se>
Date: Wed, 22 Dec 2004 10:26:58 +0100 (CET)

On Tue, 21 Dec 2004, Andreas Harth wrote:

> aharth_at_deri-swc01:~$ curl --version
> curl 7.12.2 (i386-pc-linux-gnu) libcurl/7.12.2 OpenSSL/0.9.7e zlib/1.2.2
> libidn/0.5.2

And this fails when you try the command line I used?

> Is there a way to specify the size of the chunks?

No, not with the command line tool. If you write your own app with libcurl,
you can do that easily, since libcurl sends one chunk for every buffer it
gets back from the app's read callback function.
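
Something like this minimal sketch (untested; the URL, the file name and the
4 KB cap are made up for illustration, the rest is the standard libcurl easy
interface):

#include <stdio.h>
#include <curl/curl.h>

#define CHUNK_MAX 4096  /* made-up cap: each callback return becomes one chunk */

static size_t read_cb(char *buf, size_t size, size_t nitems, void *userp)
{
  FILE *in = (FILE *)userp;
  size_t max = size * nitems;
  if(max > CHUNK_MAX)
    max = CHUNK_MAX;              /* never hand libcurl more than one chunk */
  return fread(buf, 1, max, in);  /* returning 0 at EOF ends the upload */
}

int main(void)
{
  FILE *in = fopen("bigfile.bin", "rb");  /* made-up input file */
  struct curl_slist *hdrs = NULL;
  CURL *curl;
  CURLcode res;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();
  if(!in || !curl)
    return 1;

  /* force chunked; with an unknown upload size libcurl picks it anyway */
  hdrs = curl_slist_append(hdrs, "Transfer-Encoding: chunked");

  curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
  curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);  /* makes it a PUT */
  curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
  curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
  curl_easy_setopt(curl, CURLOPT_READDATA, in);

  res = curl_easy_perform(curl);

  curl_slist_free_all(hdrs);
  curl_easy_cleanup(curl);
  curl_global_cleanup();
  fclose(in);
  return (res == CURLE_OK) ? 0 : 1;
}

Every read_cb return of N bytes goes out as one N-byte chunk, so the cap
above gives you 4 KB chunks on the wire.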

> I'd like to commit the data that is sent/received to the database every,
> say, 10 MB. The problem is that if I send a 400 MB file and don't commit to
> disk once in a while during transmission, I get OutOfMemory errors.

1. Why do you need chunked encoding for this? Can't you just save the data
    every X bytes received? (There's a sketch of that after this list.)

2. Receiving huge posts entirely in memory seems a bit naive!

3. You can be sure that libcurl will never send chunks bigger than 16KB, as
    that is the maximum size of its internal upload buffer and therefore the
    maximum chunk size it can use. Unless we change the buffer size in a
    future release, of course. I don't want to promise that we won't ever do
    that.
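
As a sketch of point 1: a write callback that commits every 10 MB instead of
buffering the whole transfer. Here commit_to_db() is just a placeholder for
whatever the real database commit looks like, and the URL and file name are
made up:

#include <stdio.h>
#include <curl/curl.h>

#define COMMIT_EVERY (10 * 1024 * 1024)  /* commit every 10 MB */

struct dl_state {
  FILE *out;
  size_t since_commit;
};

static void commit_to_db(struct dl_state *st)
{
  fflush(st->out);   /* placeholder: the real database commit goes here */
  st->since_commit = 0;
}

static size_t write_cb(char *ptr, size_t size, size_t nmemb, void *userp)
{
  struct dl_state *st = (struct dl_state *)userp;
  size_t n = fwrite(ptr, size, nmemb, st->out);
  st->since_commit += n * size;
  if(st->since_commit >= COMMIT_EVERY)
    commit_to_db(st);   /* commit periodically, keep memory use flat */
  return n * size;      /* less than size*nmemb aborts the transfer */
}

int main(void)
{
  struct dl_state st = { fopen("download.bin", "wb"), 0 };
  CURL *curl;
  CURLcode res;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();
  if(!st.out || !curl)
    return 1;

  curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/bigfile.bin");
  curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
  curl_easy_setopt(curl, CURLOPT_WRITEDATA, &st);

  res = curl_easy_perform(curl);

  commit_to_db(&st);    /* final commit for the tail end */
  fclose(st.out);
  curl_easy_cleanup(curl);
  curl_global_cleanup();
  return (res == CURLE_OK) ? 0 : 1;
}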

-- 
      Daniel Stenberg -- http://curl.haxx.se -- http://daniel.haxx.se
       Dedicated custom curl help for hire: http://haxx.se/curl.html