
curl-library

Re: Very big file upload via HTTP POST

From: Daniel Stenberg <daniel_at_haxx.se>
Date: Sun, 28 Jul 2002 19:23:52 +0200 (MET DST)

On Thu, 11 Jul 2002, Jens Viebig wrote:

> I'm using libcurl 7.9.8 to do automated HTTP multipart posts. I have to
> transfer very big files, > 100 MB. Posting smaller files works fine,
> but posting big files blows up my memory usage. I looked in the sources of
> libcurl but it was very hard to follow how forms are processed.

Yeah, it is a bit involved.

> I think I found out that all data of the form is first buffered in memory
> and then posted to the server. Is this right?

That is indeed correct.
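
For reference, a post like the one described above typically looks roughly
like this with the formadd API (the URL, field name and file name here are
just made-up examples, not anything from this thread):

  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl;
    struct curl_httppost *formpost = NULL;
    struct curl_httppost *lastptr = NULL;

    curl_global_init(CURL_GLOBAL_ALL);

    /* add one file as a part of the multipart form;
       field name and file name are placeholders */
    curl_formadd(&formpost, &lastptr,
                 CURLFORM_COPYNAME, "upload",
                 CURLFORM_FILE, "bigfile.bin",
                 CURLFORM_END);

    curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
      curl_easy_setopt(curl, CURLOPT_HTTPPOST, formpost);

      if(curl_easy_perform(curl) != CURLE_OK)
        fprintf(stderr, "post failed\n");

      curl_easy_cleanup(curl);
    }

    curl_formfree(formpost);
    curl_global_cleanup();
    return 0;
  }

With libcurl of this vintage, the whole multipart body, file contents
included, gets assembled before the transfer, which is why memory use grows
with the file size.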

> It would be nice if libcurl did this in chunks instead of reading the whole
> file at once. Perhaps this is already implemented and I missed the right
> options. If not, can you point me to the right functions, so I can try to
> implement chunked reading behaviour?

OK, all the logic you're after is in lib/formdata.c.

The function called 'Curl_getFormData' is what creates the full multipart
post chunk. That chunk is then passed to the server by pointing the "read"
function at 'Curl_FormReader', which on each invocation returns the next
chunk of data to send.

It can indeed be turned into "chunked reading", but the first function would
still need to get the size of every file to read, as libcurl must know the
full size of the request body already when issuing the request.
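
For illustration only, here is a rough sketch of what application-level
"chunked reading" looks like with libcurl's read callback for a plain
(non-multipart) POST. It sidesteps the formdata code entirely rather than
changing it, and as noted above the total size still has to be handed to
libcurl up front. The URL and file name are placeholders:

  #include <stdio.h>
  #include <curl/curl.h>

  /* libcurl calls this repeatedly; each call hands it the next chunk of
     the file instead of loading the whole thing into memory */
  static size_t read_cb(char *buffer, size_t size, size_t nitems, void *userdata)
  {
    FILE *fp = (FILE *)userdata;
    return fread(buffer, 1, size * nitems, fp);
  }

  int main(void)
  {
    CURL *curl;
    FILE *fp;
    long filesize;

    curl_global_init(CURL_GLOBAL_ALL);
    curl = curl_easy_init();
    fp = fopen("bigfile.bin", "rb");
    if(!curl || !fp)
      return 1;

    /* the full size must still be known before the request is issued */
    fseek(fp, 0, SEEK_END);
    filesize = ftell(fp);
    fseek(fp, 0, SEEK_SET);

    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
    curl_easy_setopt(curl, CURLOPT_POST, 1L);
    curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
    curl_easy_setopt(curl, CURLOPT_READDATA, fp);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, filesize);

    if(curl_easy_perform(curl) != CURLE_OK)
      fprintf(stderr, "post failed\n");

    fclose(fp);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
  }

This streams the body in chunks, but it is not a multipart post; doing the
same for forms is exactly what would require changes in lib/formdata.c.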

-- 
  Daniel Stenberg -- curl groks URLs
Received on 2002-07-28