Re: Total http/2 concurrency for multiplexed multi-handle
From: Jeroen Ooms via curl-library <curl-library_at_lists.haxx.se>
Date: Fri, 10 Feb 2023 12:42:13 +0100
On Thu, Feb 9, 2023 at 1:31 PM Daniel Stenberg <daniel_at_haxx.se> wrote:
>
> On Thu, 9 Feb 2023, Jeroen Ooms wrote:
>
> > OK, I had expected multiplexing to replace the need for
> > multi-connections.
>
> It does up to the point where the connection is "full" of streams and you ask
> for even more transfers. Then libcurl creates a new connection. Unless you
> limit the number of connections it can use.
Ah, OK, that is better than I thought. I was under the impression that
it would immediately start with 6 connections, even before considering
multiplexing.
> > Do browsers still make multiple connections to hosts that support http/2
> > multiplex?
>
> My guess: browsers probably only do that in certain situations but mostly no.
>
> > Perhaps a desirable default would be to do one or another, but not both?
>
> If you want to limit the number of used connections, libcurl offers the
> options to do so. Or you can wait with adding some of the transfer(s). The
> default libcurl behavior is generally to perform the transfer you ask for
> sooner rather than later.
Right. In my case I want to download 25k files, and let curl handle
the scheduling.
However, I noticed that even when setting CURLMOPT_MAX_HOST_CONNECTIONS
to 1, GitHub still drops the connections at some point. Perhaps the
issue isn't just the concurrency: we may be hitting a
proxy_read_timeout or something because, under the high concurrency,
individual downloads idle for too long...
I did find that the problems disappear when I disable multiplexing,
and performance isn't much worse (about 6 minutes for downloading the
25k files), so this solves my immediate problem.
--
Unsubscribe: https://lists.haxx.se/listinfo/curl-library
Etiquette: https://curl.se/mail/etiquette.html

Received on 2023-02-10