Re: Limit the URL size more?
From: Dimitry Andric via curl-library <curl-library_at_lists.haxx.se>
Date: Sun, 14 Dec 2025 12:28:49 +0100
On 14 Dec 2025, at 12:09, Daniel Stenberg via curl-library <curl-library_at_lists.haxx.se> wrote:
>
>
> Yesterday we received a security report [1] on curl that identified a possible performance issue. We deemed it not a security problem, but it still triggered this question:
>
> When a user sets a URL in libcurl, it allows a string of up to 8 megabytes. If that URL is packed with the maximum possible number of "../" occurrences, all of those need to be "optimized away" in the dotdot normalization phase of the URL parsing.
>
> That makes 2,666,663 three-byte sequences to remove in the worst possible case.
>
> While asking ourselves how much faster we can make that code, I figure we could also open up the question:
>
> How long a URL do we really need to support? In practice, a URL longer than a few tens of kilobytes is not likely to actually work on the internet. Most HTTP servers, for example, have a limit of somewhere around 8-9 kilobytes for the path component.
>
> Maybe we could save ourselves and our users some future trouble by reducing the maximum URL size? Maybe we could set it to 1 megabyte (or even lower) without any legitimate user ever noticing?
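
As a rough illustration of the case described above, here is a minimal sketch (not part of the report) that builds a "../"-packed URL and runs it through libcurl's public URL API, which performs the dotdot normalization. The host name, the number of "../" sequences and the exact size cap are illustrative assumptions; the 8 MB cap itself is an internal libcurl limit.

/* build with: cc dotdot.c -lcurl */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <curl/curl.h>

int main(void)
{
  /* illustrative: one million "../" sequences, roughly 3 MB, below the cap */
  const size_t ndotdot = 1000000;
  const char prefix[] = "https://example.com/";
  const size_t plen = sizeof(prefix) - 1;
  const size_t len = plen + ndotdot * 3;
  char *url = malloc(len + 1);
  if(!url)
    return 1;
  memcpy(url, prefix, plen);
  for(size_t i = 0; i < ndotdot; i++)
    memcpy(url + plen + i * 3, "../", 3);
  url[len] = '\0';

  CURLU *h = curl_url();
  if(!h) {
    free(url);
    return 1;
  }
  /* curl_url_set() parses the URL, including the dotdot removal */
  CURLUcode rc = curl_url_set(h, CURLUPART_URL, url, 0);
  if(!rc) {
    char *path;
    if(!curl_url_get(h, CURLUPART_PATH, &path, 0)) {
      printf("input: %zu bytes, normalized path: %s\n", len, path);
      curl_free(path);
    }
  }
  else
    fprintf(stderr, "curl_url_set() failed: %d\n", (int)rc);
  curl_url_cleanup(h);
  free(url);
  return 0;
}

Running it under "time" and varying ndotdot gives a feel for how the normalization cost grows with the number of "../" sequences.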
Maybe our company is a bit of an outlier, but our product does have an option where small-ish MP4 files (ISOBMFF containers) can be sent to a transcoding server, optionally via a GET request with the MP4 base64-encoded into the URL.
Obviously, there is a limit to how large an MP4 you can send that way, and for anything larger you must use POST or PUT, but there is still something to be said for being able to choose a limit (preferably at run time) that makes sense for your particular use case.
E.g., we usually also configure Apache with "LimitRequestLine 100000000" to allow longer request lines. I think its built-in default is something quite low, like 8k or 16k, but it is handy to be able to configure it.
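
For a back-of-the-envelope feel for how a lower URL cap would interact with that use case: base64 expands every 3 payload bytes into 4 URL characters, so the sketch below (my own illustration; the 100-byte allowance for scheme, host, path and parameter name is an arbitrary assumption, and it ignores any percent-encoding of '+' and '/') estimates how much raw MP4 data fits into a GET URL of a given maximum length.

#include <stdio.h>

/* hypothetical helper: payload bytes that fit in a URL of max_url bytes
   once base64 encoded, leaving `overhead` bytes for everything else */
static size_t payload_that_fits(size_t max_url, size_t overhead)
{
  if(max_url <= overhead)
    return 0;
  return ((max_url - overhead) / 4) * 3; /* 4 base64 chars carry 3 bytes */
}

int main(void)
{
  printf("8 MB URL cap: ~%zu payload bytes\n", payload_that_fits(8000000, 100));
  printf("1 MB URL cap: ~%zu payload bytes\n", payload_that_fits(1000000, 100));
  return 0;
}

Under those assumptions, an 8 MB cap leaves room for roughly a 6 MB MP4, while a 1 MB cap would bring that down to roughly 750 KB.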
-Dimitry