Re: HSTS cache cap allows eviction of security entries
From: Dan Fandrich via curl-library <curl-library_at_lists.haxx.se>
Date: Wed, 1 Apr 2026 14:12:52 -0700
On Wed, Apr 01, 2026 at 10:55:07PM +0200, Daniel Stenberg via curl-library wrote:
> Today I implemented a cap in how many HSTS entries libcurl keeps in memory, to prevent it from
> being a never-ending growth that could eventually cause problems. I set the limit to 1000 entries,
> quite arbitrarily.
This strikes me as a very low limit, especially since HSTS is intended as a
security measure and a limit exists at all only for DoS protection. Given
that an HSTS entry doesn't take much space (maybe 100 bytes per host), a
limit of 1000 only takes about 0.0001 GB. IMHO bumping it another 2 or 3
orders of magnitude is more appropriate if this is going to be a hard limit.
Three orders of magnitude is only a million hosts, which a crawler with a
gigabit Internet connection could reach in a mere 30 seconds, after which it
would start expunging entries, making HSTS useless for it.
Dan
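
[For reference, the sizes in the message above work out like this. A quick back-of-the-envelope sketch, not libcurl code; the 100-bytes-per-entry figure is the mail's rough estimate, not a measured number.]

```python
# Rough per-host HSTS entry size assumed in the discussion above
# (hostname + expiry timestamp + flags): an estimate, not measured.
BYTES_PER_ENTRY = 100

def cache_bytes(max_entries: int, bytes_per_entry: int = BYTES_PER_ENTRY) -> int:
    """Worst-case memory used by a full HSTS cache at the given cap."""
    return max_entries * bytes_per_entry

# Current cap of 1000 entries: about 100 KB (the "0.0001 GB" in the mail).
print(cache_bytes(1000))        # 100000 bytes

# Three orders of magnitude higher (one million entries): about 100 MB.
print(cache_bytes(1_000_000))   # 100000000 bytes
```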
-- Unsubscribe: https://lists.haxx.se/mailman/listinfo/curl-library Etiquette: https://curl.se/mail/etiquette.html
Received on 2026-04-01