curl-library
Re: Chunked encoding content in callback writefunction?
Date: Tue, 27 May 2003 16:58:36 +0200
Hi Daniel,
Wow, that's a really fast response, thanks!
Yeah, I agree that normal usage does not require this at all. I'm using
libcurl for a proxy connection, which is probably not what it was originally
designed for. And yes, I know I could do the proxying with other components,
but I really like the curl approach for various reasons ;-)
I still think it would be nice to have the option of keeping header and
content data consistent. Your advice about removing the transfer-encoding
information from the header sounds good; I'll look into that. There might be
an issue with the Content-Length, though, since it is not present in the HTTP
header for chunked-encoded content, right? So I would have to calculate the
content length before I could reassemble a correct header. That would mean
caching the whole body before I can send back a response, if I want to
include a Content-Length header. Does libcurl cache all content anyway, even
for huge downloads?
The HTTP 1.0 switch seems to work fine as the fallback workaround.
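For reference, the 1.0 fallback is a single option on the easy handle (a fragment, assuming an already-initialised `CURL *curl`):

```c
/* Request HTTP/1.0; servers then answer without chunked
   transfer-encoding, so the body arrives "raw". */
curl_easy_setopt(curl, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_0);
```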
Thanks a lot,
Cyrill
--On Dienstag, 27. Mai 2003 16:36 +0200 Daniel Stenberg <daniel_at_haxx.se>
wrote:
> On Tue, 27 May 2003, Cyrill Osterwalder wrote:
>
>> I'm using the callback function to receive the content and process it
>> myself. It seems that libcurl automatically decodes the chunked encoding
>> and does not return the content in the "raw" form. In my program the
>> consistency between the received header and the content is crucial. The
>> received header still contains "transfer-encoding: chunked" and no
>> content length but the received content does not contain the chunked
>> encoding anymore.
>>
>> Is this the intentional behavior of libcurl
>
> It is.
>
>> or am I just doing something wrong?
>
> Nope.
>
>> Is there a way to get the raw content of a transfer to process it
>> afterwards?
>
> Nobody has requested this before, so we haven't made it possible, and I
> still can't really see the point of doing it.
>
>> I guess I can force HTTP 1.0 in order to get rid of the chunked encoding
>> stuff but that's only a workaround.
>
> That's indeed a work-around, but I wouldn't mind offering some kind of
> ability to get the RAW untreated data passed on to the application.
>
> Can't you just hide the "chunked" header info from whatever is expecting
> the chunked data? Then you won't need to duplicate the same code that
> libcurl already offers. It seems silly to me.
>
> --
> Daniel Stenberg -- curl: been grokking URLs since 1998
>
>
> -------------------------------------------------------
> This SF.net email is sponsored by: ObjectStore.
> If flattening out C++ or Java code to make your application fit in a
> relational database is painful, don't do it! Check out ObjectStore.
> Now part of Progress Software. http://www.objectstore.net/sourceforge
Received on 2003-05-27