Awarded 2400 USD

CVE-2023-23916

HTTP multi-header compression denial of service

Project curl Security Advisory, February 15th 2023

VULNERABILITY

curl supports "chained" HTTP compression algorithms, meaning that a server response can be compressed multiple times and potentially with different algorithms. The number of acceptable "links" in this "decompression chain" was capped, but the cap was implemented on a per-header basis, allowing a malicious server to insert a virtually unlimited number of compression steps simply by using many headers.

The use of such a decompression chain could result in a "malloc bomb", causing curl to allocate enormous amounts of heap memory, or to attempt to and fail with out-of-memory errors.

INFO

Automatic decompression of content needs to be enabled per transfer, but because of how Transfer-Encoding works in curl, this code path and problem can be reached with default options.
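To make the distinction concrete: Content-Encoding decompression is the opt-in part, enabled per easy handle with CURLOPT_ACCEPT_ENCODING, while Transfer-Encoding decoding is on by default (controlled by CURLOPT_HTTP_TRANSFER_DECODING). A minimal sketch, using a placeholder URL:

```c
#include <curl/curl.h>

int main(void)
{
    CURL *curl = curl_easy_init();
    if (curl) {
        /* Content-Encoding decompression is opt-in per transfer; an empty
           string requests all built-in encodings (gzip, brotli, zstd...) */
        curl_easy_setopt(curl, CURLOPT_ACCEPT_ENCODING, "");

        /* Transfer-Encoding decoding is on by default, which is why the
           flaw is reachable even without the option above */
        curl_easy_setopt(curl, CURLOPT_HTTP_TRANSFER_DECODING, 1L);

        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
    }
    return 0;
}
```

An application that never sets CURLOPT_ACCEPT_ENCODING is therefore still exposed through Transfer-Encoding with default options.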

This flaw exists with one or more of the compression algorithms built-in (gzip, brotli or zstd), but the individual algorithms have different "exploding" powers.

Both Content-Encoding: and Transfer-Encoding: are affected over all HTTP versions.

This flaw is almost identical to the previous CVE-2022-32206: HTTP compression denial of service, as the fix for that earlier flaw was incomplete.

CWE-770: Allocation of Resources Without Limits or Throttling

Severity: Medium

AFFECTED VERSIONS

libcurl is used by many applications, but not always advertised as such!

SOLUTION

The number of accepted "chained" algorithms is now capped to 5 in total, independently of the number of headers.

RECOMMENDATIONS

A - Upgrade curl to version 7.88.0

B - Apply the patch to your local version

TIMELINE

This issue was reported to the curl project on January 8, 2023. We contacted distros@openwall on February 7, 2023.

libcurl 7.88.0 was released on February 15, 2023, coordinated with the publication of this advisory.

CREDITS

Thanks a lot!