Re: --max-filesize and --compressed
From: Andreas Mohr via curl-users <curl-users_at_lists.haxx.se>
Date: Mon, 2 Mar 2026 12:38:57 +0100
On Mon, Mar 02, 2026 at 09:53:34AM +0100, Daniel Stenberg via curl-users wrote:
> Would it make sense to have some kind of limit to the "explosion factor" ?
Ah, indeed, possibly.
An "explosion factor" limit seems much better
targeted *) than a raw output-file-size limit,
since the most damaging characteristic of a
compression bomb is its *very abnormally* high
explosion factor relative to its tiny input data,
and especially relative to plain "normal"
decompression cases.
*)
- quite reliably catches compression bombs
- is more future-proof than enacting some fixed
  raw file size limit (given the ever-growing
  capabilities of systems!)
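To make the idea concrete, here is a minimal sketch of what an explosion-factor check could look like, in Python with zlib standing in for curl's content decoders; the function name and the 64 KiB step size are my own choices, not anything curl implements:

```python
import zlib

def inflate_with_ratio_limit(data, max_ratio=100):
    """Decompress zlib-wrapped data, aborting as soon as the output
    grows past max_ratio times the compressed input size."""
    limit = len(data) * max_ratio
    d = zlib.decompressobj()
    out = bytearray()
    buf = data
    while buf:
        # Cap each step at 64 KiB so a bomb is caught incrementally,
        # not after the whole stream has been inflated into memory.
        chunk = d.decompress(buf, 65536)
        out += chunk
        if len(out) > limit:
            raise ValueError("explosion factor limit exceeded")
        buf = d.unconsumed_tail
    out += d.flush()
    return bytes(out)
```

Note the limit is checked inside the loop: a streaming decoder can stop early, before the bomb has fully expanded.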
OTOH, with an input file of, say, 1GB and an
explosion factor limit of only 100 **), one would
still end up with a dangerously large 100GB file...
**) which might already be too low for many
legitimate decompression activities
So, should one resort to a plain file-size-based
limit after all?
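One way to resolve the tension would be to enforce both bounds and abort on whichever is hit first: the ratio limit catches bombs hidden in small inputs, while an absolute ceiling covers the large-input case above. A hedged sketch in Python, again with zlib standing in for curl's decoders and all names and defaults my own invention:

```python
import zlib

def inflate_bounded(data, max_ratio=100, max_size=1 << 30):
    """Decompress with both an explosion-factor limit and an
    absolute output ceiling (1 GiB here); the tighter bound wins."""
    limit = min(len(data) * max_ratio, max_size)
    d = zlib.decompressobj()
    out = bytearray()
    buf = data
    while buf:
        chunk = d.decompress(buf, 65536)  # inflate in 64 KiB steps
        out += chunk
        if len(out) > limit:
            raise ValueError("decompression output limit exceeded")
        buf = d.unconsumed_tail
    out += d.flush()
    return bytes(out)
```

With `min()`, a 1GB input under a factor-100 limit is still capped at the absolute ceiling rather than at 100GB.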
Greetings
Andreas Mohr
--
Unsubscribe: https://lists.haxx.se/mailman/listinfo/curl-users
Etiquette: https://curl.se/mail/etiquette.html
Received on 2026-03-02