curl-library
Re: how libcurl receives chunked data from http server?
Date: Mon, 22 Sep 2003 23:11:54 +0200 (CEST)
On Mon, 22 Sep 2003, Jerry G. Chiuan wrote:
> Here, what is the purpose of receiving partial response data and then
> calling the write callback? Why is it not designed to receive the whole
> response at once, in a single buffer?
Because that would A) take LOTS of memory, and B) prevent libcurl from
downloading large files (those bigger than the amount of available memory).
> Is it because libcurl expects application to copy partial data from buffer
> and the buffer can be reused by libcurl for rest of response data? ( by
> doing this, less memory would be consumed )
Yes, and to allow any-size data to be downloaded. There's often no point in
keeping the entire response in memory. And if you DO want to, libcurl still
offers you the ability to do so.
> I hope libcurl can receive the whole response data at a time, even in a
> single buffer, then my application doesn't need to copy partial response
> data from libcurl buffers to an extra buffer before dealing with it. ( I
> mean I can avoid one more copy).
Then you need to patch libcurl to make sure this happens. As I told you, you
cannot assume this will happen even if the remote file is smaller than the
buffer libcurl uses for downloads.
> If libcurl doesn't do that,
> - first, I need to copy every partial data to an extra buffer, reconstruct
> them over there to be a whole response. ( response consists of several
> chunks )
> - secondly, copy "one more time" for each chunk from there to the buffer I
> have had in my application.
No, you only need to copy the data once. In the callback you provide.
> I hope I can avoid the "first time" copy!
You don't need to copy the data twice. You just need to write your callback
code smartly enough.
> >You would add significant complexity to the code though.
>
> Any risk if I do so? Will it be really complicated?
You can and should judge that yourself, it's not a venture I want to explore
personally. Feel free to report here when/if you achieve anything noteworthy!
-- Daniel Stenberg -- curl: been grokking URLs since 1998
Received on 2003-09-22