curl-library
Memory leak during gzip decompression
Date: Thu, 07 May 2009 19:54:27 +0200
Hi,
When libcurl is used with CURLOPT_ENCODING set to "" to access a site that returns its data with
Content-Encoding: gzip, libcurl (or zlib) leaks memory.
Example url: http://search.live.com/results.aspx?go=&form=QBLH&q=%22james%22
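For reference, here is a minimal reproducer I'd expect to show the leak under valgrind. It uses only the
options already mentioned above; error checking is omitted for brevity:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl;

    curl_global_init(CURL_GLOBAL_ALL);
    curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL,
        "http://search.live.com/results.aspx?go=&form=QBLH&q=%22james%22");
      /* "" asks libcurl to advertise and decode every encoding it was
         built with (gzip, deflate) */
      curl_easy_setopt(curl, CURLOPT_ENCODING, "");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    return 0;
  }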
I believe the problem stems from the fact that search.live.com returns data that is gzip compressed, but where the
ISIZE field (the uncompressed length modulo 2^32, stored in the last four bytes of the gzip trailer) is wrong.
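This is easy to check by hand once the full response body has been captured. isize_matches below is just a
hypothetical helper illustrating the comparison as RFC 1952 defines it, not anything from the curl source:

  #include <stddef.h>
  #include <stdint.h>

  /* Compare the trailing ISIZE field of a complete gzip stream against
     the number of bytes it actually inflates to. Per RFC 1952, ISIZE
     is the last four bytes, little-endian, uncompressed size mod 2^32. */
  static int isize_matches(const unsigned char *gz, size_t gz_len,
                           unsigned long inflated_len)
  {
    uint32_t isize;

    if(gz_len < 8) /* too short to even hold the CRC32 + ISIZE trailer */
      return 0;

    isize = (uint32_t)gz[gz_len - 4]
          | (uint32_t)gz[gz_len - 3] << 8
          | (uint32_t)gz[gz_len - 2] << 16
          | (uint32_t)gz[gz_len - 1] << 24;

    return isize == (uint32_t)(inflated_len & 0xffffffffUL);
  }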
The relevant RFC, http://tools.ietf.org/html/rfc1952, under section "2.3.1.2. Compliance", doesn't require a
decompressor to examine the trailing CRC32 or ISIZE fields (and, by the look of it, no sane decoder enforces
them), and no browser throws an error when the ISIZE field is incorrect (I guess because Transfer-Encoding:
chunked already ensures that the data is received in whole), so my guess is that:
a. The memory leak shouldn't be present, even for slightly malformed data (see the zlib sketch below).
b. The current behaviour that silently forgives this deviation from the RFC should be left as-is.
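Regarding a., I'd expect the fix to be a matter of cleanup discipline rather than stricter parsing. I don't
know what content_encoding.c actually does internally, so the following is only a sketch of the pattern with
plain zlib: inflateEnd() has to run on every exit path, even when inflate() reports Z_DATA_ERROR for a bogus
trailer:

  #include <stddef.h>
  #include <string.h>
  #include <zlib.h>

  /* Sketch only: decompress a gzip buffer using zlib's built-in gzip
     support (windowBits = MAX_WBITS + 16). The point is that
     inflateEnd() runs unconditionally, so a bad CRC32/ISIZE trailer
     (Z_DATA_ERROR) cannot leak the z_stream state. A strict caller
     could still choose to reject Z_DATA_ERROR instead of forgiving it. */
  static int gunzip_buf(const unsigned char *in, size_t in_len,
                        unsigned char *out, size_t out_len)
  {
    z_stream z;
    int rc;

    memset(&z, 0, sizeof(z));
    if(inflateInit2(&z, MAX_WBITS + 16) != Z_OK)
      return -1;

    z.next_in = (Bytef *)in;
    z.avail_in = (uInt)in_len;
    z.next_out = out;
    z.avail_out = (uInt)out_len;

    rc = inflate(&z, Z_FINISH);

    inflateEnd(&z); /* must not be skipped, whatever inflate() said */

    return (rc == Z_STREAM_END || rc == Z_DATA_ERROR) ? 0 : -1;
  }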
I don't know enough about content_encoding.c and zlib to track this down further, unfortunately...
Regards,
Bálint Szilakszi
libcurl - Perl binding maintainer