Re: GZIP encoding error - invalid code lengths set

From: Dobromir Velev <diadomraz_at_gmail.com>
Date: Fri, 14 Aug 2009 15:10:12 +0300

Well, the servers I've confirmed to have this bug are all on
Akamai's network, and from what I heard from them it has something
to do with, quote, "dirty cache at the remote servers". Anyway, I
thought it would be faster to patch my local version of curl to
ignore the ISIZE check, and I was wondering if anybody has seen
something like this before.
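
To make the check concrete: per RFC 1952, a gzip member ends with an
8-byte trailer - four bytes of CRC32 over the uncompressed data,
followed by four bytes of ISIZE, the uncompressed length modulo 2^32,
both little-endian. Here is a small illustrative sketch of what that
end-of-stream check amounts to - plain C, not curl code, and the
helper names (le32, gzip_isize_ok) are made up for the example:

#include <stddef.h>
#include <stdint.h>

/* Read a little-endian 32-bit value. */
static uint32_t le32(const unsigned char *p)
{
  return (uint32_t)p[0] | ((uint32_t)p[1] << 8) |
         ((uint32_t)p[2] << 16) | ((uint32_t)p[3] << 24);
}

/* Returns 1 when the trailer's ISIZE matches what was decompressed. */
int gzip_isize_ok(const unsigned char *gz, size_t gz_len,
                  size_t decompressed_len)
{
  if(gz_len < 8) /* need at least the CRC32 + ISIZE trailer */
    return 0;
  return le32(gz + gz_len - 4) ==
         (uint32_t)(decompressed_len & 0xffffffffUL);
}

When a cache mangles those trailer bytes, the payload can still
decode byte-for-byte correctly while the final check fails - which is
the situation the patch below papers over.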

Below is the patch that currently works with those servers - but I
agree with you that it should not be added to cURL, since it
basically ignores all zlib data errors at the end of the stream and
might lead to some strange behaviour.

--- lib/content_encoding.old.c 2009-08-13 18:06:11.000000000 +0300
+++ lib/content_encoding.c 2009-08-13 18:07:47.000000000 +0300
@@ -132,7 +132,14 @@
     else if(allow_restart && status == Z_DATA_ERROR) {
       /* some servers seem to not generate zlib headers, so this is an attempt
          to fix and continue anyway */
-
+      if(z->avail_in == 0) {
+        infof(conn->data, "Ignore invalid GZIP ISIZE\n");
+        if(DSIZ - z->avail_out) {
+          result = Curl_client_write(conn, CLIENTWRITE_BODY, decomp, DSIZ - z->avail_out);
+        }
+        free(decomp);
+        return exit_zlib(z, &k->zlib_init, result);
+      }
       (void) inflateEnd(z); /* don't care about the return code */
       if(inflateInit2(z, -MAX_WBITS) != Z_OK) {
         free(decomp);
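
For completeness, here is a small standalone program that reproduces
the failure mode the patch handles, using plain zlib and nothing
curl-specific: it builds a gzip buffer, corrupts the ISIZE field and
inflates it back. As far as I can tell, inflate() first produces
every payload byte and only then fails with Z_DATA_ERROR, with
avail_in already at 0 - the condition the patch checks before
flushing:

#include <stdio.h>
#include <string.h>
#include <zlib.h>

int main(void)
{
  static const unsigned char plain[] = "hello gzip trailer check";
  unsigned char gz[256], out[256];
  z_stream z;
  uLong gz_len;
  int rc;

  /* Compress into gzip format: windowBits 15+16 selects the gzip
     wrapper (header plus CRC32/ISIZE trailer). */
  memset(&z, 0, sizeof(z));
  if(deflateInit2(&z, Z_DEFAULT_COMPRESSION, Z_DEFLATED, 15 + 16, 8,
                  Z_DEFAULT_STRATEGY) != Z_OK)
    return 1;
  z.next_in = (Bytef *)plain;
  z.avail_in = sizeof(plain);
  z.next_out = gz;
  z.avail_out = sizeof(gz);
  if(deflate(&z, Z_FINISH) != Z_STREAM_END)
    return 1;
  gz_len = z.total_out;
  deflateEnd(&z);

  /* Flip a bit in the last byte, i.e. in ISIZE (the final four bytes
     of a gzip member hold the uncompressed length mod 2^32). */
  gz[gz_len - 1] ^= 0x01;

  /* Decompress: the payload comes out fine, then the trailer check
     fails with Z_DATA_ERROR while avail_in is already 0. */
  memset(&z, 0, sizeof(z));
  if(inflateInit2(&z, 15 + 16) != Z_OK)
    return 1;
  z.next_in = gz;
  z.avail_in = (uInt)gz_len;
  z.next_out = out;
  z.avail_out = sizeof(out);
  rc = inflate(&z, Z_FINISH);
  printf("inflate() = %d (%s), %lu bytes out, avail_in = %u\n",
         rc, z.msg ? z.msg : "no msg", z.total_out, z.avail_in);
  inflateEnd(&z);
  return 0;
}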

Dobromir Velev

On Fri, Aug 14, 2009 at 2:45 PM, Daniel Stenberg <daniel_at_haxx.se> wrote:
>
> On Thu, 13 Aug 2009, Dobromir Velev wrote:
>
>> I was wondering if there is a way to make curl and/or zlib
>> silently ignore this error, since the content is downloaded and
>> decompressed correctly. I can modify content_encoding.c to return
>> all available data and exit when a data error is detected and
>> there is no more data to decode, but I'm not sure what will happen
>> if more data is received at a later stage.
>>
>> Attached are two trace files, generated a minute apart with the
>> same request. One is OK and the other fails with
>
> Ouch. This is a seriously broken server. Have you tried to simply
> contact the admins of it to ask if they can fix it instead of trying
> to shoe-horn in a weirdo work-around in libcurl?
>
> This looks like a broken zlib stream that libcurl/zlib rightfully
> whines about. Making an illegal value become OK seems like a huge
> kludge to me!
>
> --
>
>  / daniel.haxx.se

--
Dobromir Velev
---------------------------------------------------------------------------
"Never attribute to malice that which can be
adequately explained by stupidity"