curl-library
Re: Large Post data.... buffering?
Date: Wed, 11 Aug 2004 15:27:43 -0400
> Another option would be to enforce chunked-encoding, but that would
> require that the server understands it (HTTP 1.1) and will add a
> slight overhead on the network traffic.
>
Thanks for the option.
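
For the archive, a rough sketch of what forcing chunked encoding could
look like with plain libcurl (the URL and file name are placeholders,
and of course the server has to speak HTTP/1.1):

  /* rough sketch: force a chunked POST with libcurl;
   * "bigfile.dat" and the URL are placeholders */
  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl;
    struct curl_slist *hdrs = NULL;
    FILE *src = fopen("bigfile.dat", "rb");

    if(!src)
      return 1;

    curl_global_init(CURL_GLOBAL_ALL);
    curl = curl_easy_init();
    if(curl) {
      /* supplying this header ourselves (and no CURLOPT_POSTFIELDSIZE)
         makes libcurl send the body chunked instead of with a
         Content-Length */
      hdrs = curl_slist_append(hdrs, "Transfer-Encoding: chunked");

      curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
      curl_easy_setopt(curl, CURLOPT_POST, 1L);
      curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
      /* the default read callback fread()s from this FILE* */
      curl_easy_setopt(curl, CURLOPT_READDATA, src);

      curl_easy_perform(curl);

      curl_slist_free_all(hdrs);
      curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    fclose(src);
    return 0;
  }
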
>> 4) I close my FILE* after the transfer completes.
>>
>> This seems to work perfectly... could you just confirm the approach?
>> It's not clear from the Docs that the FILE* read mechanism uses
>> buffering.
>
> I'm not sure what you mean by "buffering" here. This approach will
> make libcurl load the data to upload from the given file. It will not
> (and cannot) load the whole file at once. It will in fact load
> CURL_MAX_WRITE_SIZE bytes at a time from the file into the internal
> buffer and upload it.
>
Yes, that's exactly what I meant -- I/O buffering on the client side. The
reason I ask is that we're using libcurl via CURLHandle, and until I
just mod'd it to use the FILE* upload, you had to read the entire file
into memory and set it as the post string. This was of course a
CURLHandle limitation, which has now been fixed.
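In case it helps anyone else reading the archive, the FILE*-based POST
boils down to roughly the sketch below in plain libcurl (the URL, file
name and error handling are illustrative only -- CURLHandle wraps the
same options differently):

  /* rough sketch of the FILE*-based POST discussed above: libcurl
   * pulls the body from the FILE* one internal buffer at a time, so
   * the whole file never has to sit in memory;
   * "bigpost.dat" and the URL are placeholders */
  #include <stdio.h>
  #include <sys/stat.h>
  #include <curl/curl.h>

  int main(void)
  {
    struct stat st;
    CURL *curl;
    FILE *src = fopen("bigpost.dat", "rb");

    if(!src)
      return 1;
    if(stat("bigpost.dat", &st)) {
      fclose(src);
      return 1;
    }

    curl_global_init(CURL_GLOBAL_ALL);
    curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
      curl_easy_setopt(curl, CURLOPT_POST, 1L);
      /* known size -> libcurl sends a Content-Length header */
      curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE_LARGE,
                       (curl_off_t)st.st_size);
      /* the default read callback fread()s from this FILE* in
         buffer-sized pieces */
      curl_easy_setopt(curl, CURLOPT_READDATA, src);

      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    fclose(src);   /* close the FILE* after the transfer completes */
    return 0;
  }
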
> Thanks. Based on your input, I edited the description for CURLOPT_POST
> in the curl_easy_setopt man page. Hopefully it'll be a bit clearer
> now.
Cool! Glad I could help and I hope it helps others.
Thanks again,
Alan
Received on 2004-08-11