Re: Max connections per peer / IP
From: Stefan Eissing via curl-library <curl-library_at_lists.haxx.se>
Date: Thu, 30 Oct 2025 16:53:52 +0100
> Am 30.10.2025 um 15:37 schrieb Patrick Schlangen <patrick_at_schlangen.me>:
>
> Am 30.10.2025 um 15:27 schrieb Stefan Eissing <stefan_at_eissing.org>:
>> Maybe it would be good to take a step back and describe what you actually want to achieve? Is this for a private setup only or should it work on the public internet? etc.
>
> Thanks for your response.
>
> I'm creating lots of parallel connections to different (user supplied) URLs, and I'm running into situations where many of those URLs resolve to the same IP address (e.g. same edge cache node used by many different sites, same API gateway provider shared by many URLs, etc).
> This sometimes results in too many parallel connections to the same machine, which can trigger firewall / WAF rules intended for DoS protection. Sometimes excess connections are just rejected, sometimes they ban my IP address for a while.
> I would like to ensure I'm not making too many parallel requests to the same machine. (Also to be a nice citizen and not overload anyone's systems.)
>
> Hope it makes sense.
Yes, thanks for the explanation. I understand how that feature would be useful.
Pondering it, I see no easy way to add that in the current connection handling. Certainly not impossible, but not trivial either.
- Stefan
> Best,
>
> Patrick
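A possible application-side workaround, sketched here under the assumption that the URLs are known up front: resolve each URL's host yourself, group transfers by resolved address, and only keep a bounded number of easy handles per address in the multi handle at any time. CURLMOPT_MAX_HOST_CONNECTIONS caps connections per host name, not per IP address, so the per-IP bookkeeping has to live in the application. The MAX_PER_IP cap, the job struct and the resolve_ip() helper below are illustrative, not libcurl features, and error handling is kept minimal.

/* Sketch of an application-side cap on parallel transfers per resolved IP address.
   MAX_PER_IP, struct job and resolve_ip() are illustrative, not libcurl features. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/socket.h>
#include <netdb.h>
#include <curl/curl.h>

#define MAX_PER_IP 4                 /* assumed per-address cap */

struct job {
  char url[512];
  char ip[NI_MAXHOST];               /* first resolved address of the host */
  int running;                       /* currently added to the multi handle */
};

/* resolve a host name to its first address, as a numeric string */
static int resolve_ip(const char *host, char *ip, size_t iplen)
{
  struct addrinfo hints, *res = NULL;
  memset(&hints, 0, sizeof(hints));
  hints.ai_socktype = SOCK_STREAM;
  if(getaddrinfo(host, NULL, &hints, &res) || !res)
    return -1;
  int rc = getnameinfo(res->ai_addr, res->ai_addrlen, ip, (socklen_t)iplen,
                       NULL, 0, NI_NUMERICHOST);
  freeaddrinfo(res);
  return rc ? -1 : 0;
}

/* count jobs currently running against a given address */
static int active_for_ip(struct job *jobs, int njobs, const char *ip)
{
  int i, n = 0;
  for(i = 0; i < njobs; i++)
    if(jobs[i].running && !strcmp(jobs[i].ip, ip))
      n++;
  return n;
}

int main(int argc, char **argv)
{
  int i, pending, still, left, njobs = argc - 1;
  struct job *jobs = calloc(njobs, sizeof(*jobs));
  CURLM *multi;
  CURLMsg *msg;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  multi = curl_multi_init();

  /* resolve every URL's host once, up front */
  for(i = 0; i < njobs; i++) {
    CURLU *u = curl_url();
    char *host = NULL;
    snprintf(jobs[i].url, sizeof(jobs[i].url), "%s", argv[i + 1]);
    if(!curl_url_set(u, CURLUPART_URL, jobs[i].url, 0) &&
       !curl_url_get(u, CURLUPART_HOST, &host, 0))
      resolve_ip(host, jobs[i].ip, sizeof(jobs[i].ip));
    curl_free(host);
    curl_url_cleanup(u);
  }

  pending = njobs;
  while(pending) {
    /* start every unfinished job whose address is under the cap */
    for(i = 0; i < njobs; i++) {
      if(jobs[i].running || !jobs[i].url[0] ||
         active_for_ip(jobs, njobs, jobs[i].ip) >= MAX_PER_IP)
        continue;
      CURL *e = curl_easy_init();
      curl_easy_setopt(e, CURLOPT_URL, jobs[i].url);
      curl_easy_setopt(e, CURLOPT_PRIVATE, &jobs[i]);
      curl_multi_add_handle(multi, e);
      jobs[i].running = 1;
    }

    curl_multi_perform(multi, &still);
    curl_multi_poll(multi, NULL, 0, 1000, NULL);

    while((msg = curl_multi_info_read(multi, &left))) {
      if(msg->msg == CURLMSG_DONE) {
        struct job *j = NULL;
        curl_easy_getinfo(msg->easy_handle, CURLINFO_PRIVATE, &j);
        j->running = 0;
        j->url[0] = '\0';            /* mark finished so it is not restarted */
        curl_multi_remove_handle(multi, msg->easy_handle);
        curl_easy_cleanup(msg->easy_handle);
        pending--;
      }
    }
  }

  curl_multi_cleanup(multi);
  curl_global_cleanup();
  free(jobs);
  return 0;
}

Run with the URLs as arguments. CURLMOPT_MAX_TOTAL_CONNECTIONS can additionally cap the overall connection count, but the grouping of different host names onto one IP address still has to be tracked outside libcurl.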
--
Unsubscribe: https://lists.haxx.se/mailman/listinfo/curl-library
Etiquette: https://curl.se/mail/etiquette.html
Received on 2025-10-30