Re: Limit the URL size more?
From: Peter Krefting via curl-library <curl-library_at_lists.haxx.se>
Date: Mon, 15 Dec 2025 09:16:32 +0100 (CET)
Hi!
> How long a URL do we really need to support? In practice, a URL
> longer than a few tens of kilobytes is not likely to actually work
> on the internet. Most HTTP servers, for example, have a limit of
> somewhere around 8-9 kilobytes for the path component.
The Web UI framework we're using sends all the indices of a
multi-select in a list as one long comma-separated GET parameter; in
our case we had a list of up to 2000 entries, which came to just
below 10000 characters (5 characters times 2000 entries, give or
take), so we had to change the web server configuration to allow
that.
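For a rough feel of the numbers (entry count and index width made up
to match the description above), the length adds up something like
this:

  #include <stdio.h>
  #include <stdlib.h>

  /* Hypothetical illustration: 2000 selected indices sent as one
     comma-separated GET parameter. */
  int main(void)
  {
      const size_t entries = 2000;
      /* worst case: four digits plus a comma per entry, plus NUL */
      char *list = malloc(entries * 5 + 1);
      if(!list)
          return 1;
      size_t len = 0;
      for(size_t i = 0; i < entries; i++)
          len += sprintf(list + len, "%zu,", i);
      list[--len] = '\0'; /* drop the trailing comma */
      printf("query parameter is %zu characters\n", len);
      free(list);
      return 0;  /* prints 8889 for indices 0..1999 */
  }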
Some of this traffic can go through a proxy server we've written that
uses libcurl to talk to the backend.
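On the libcurl side that is nothing special, since libcurl takes the
URL as one plain string; a minimal sketch (the backend host name and
parameter name are made up):

  #include <curl/curl.h>
  #include <stdio.h>

  /* Sketch: forward a request with a very long query string to a
     backend. "backend.example" and "sel" are invented names. */
  int forward(const char *query)  /* e.g. the ~9000-char index list */
  {
      CURL *curl = curl_easy_init();
      if(!curl)
          return 1;

      /* a real proxy would size this buffer dynamically */
      char url[16384];
      snprintf(url, sizeof(url),
               "http://backend.example/list?sel=%s", query);

      curl_easy_setopt(curl, CURLOPT_URL, url);
      CURLcode res = curl_easy_perform(curl);
      if(res != CURLE_OK)
          fprintf(stderr, "curl: %s\n", curl_easy_strerror(res));

      curl_easy_cleanup(curl);
      return res != CURLE_OK;
  }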
Back when I was working for Opera Software, we used to have a
fixed-size buffer that limited the maximum URL size, but I wrote
some code that detected longer links and dynamically allocated a
buffer when it found one; this meant the URL size was in practice
bounded only by the amount of available/addressable memory (we
especially needed this when we implemented data URLs).
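Not the actual Opera code, of course, but the pattern was roughly
this: try the fixed buffer first and fall back to a heap allocation
only for oversized URLs:

  #include <stdlib.h>
  #include <string.h>

  /* Use a fixed-size buffer for the common case and switch to a
     heap-allocated one when a longer URL shows up. The caller frees
     the result only when it differs from the fixed buffer. */
  char *copy_url(const char *url, char *fixed, size_t fixedsize)
  {
      size_t len = strlen(url);
      if(len < fixedsize) {
          memcpy(fixed, url, len + 1);
          return fixed;            /* common case: no allocation */
      }
      char *dyn = malloc(len + 1); /* bounded only by memory */
      if(dyn)
          memcpy(dyn, url, len + 1);
      return dyn;
  }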
> Maybe we could save ourselves and our users some future trouble by
> reducing the maximum URL size limit? Maybe we can set it to 1
> megabyte (or even lower) without any legitimate user ever noticing?
I think that would work fine for everything except for data URLs,
but libcurl doesn't support those anyway.
-- 
\\// Peter - http://www.softwolves.pp.se/
Please consider the environment before using AI functionality to
reply to this email.