Re: On the right way to use HTTP/2 multiplex

From: Arnaud Rebillout via curl-library <>
Date: Thu, 21 Mar 2019 18:35:47 +0700

On 3/21/19 2:12 PM, Daniel Stenberg wrote:
>> Now, I take a bit of time to think, and I wonder if this second
>> implementation is really the smart thing to do. More precisely: by
>> feeding handles one by one (even though we might have 8 active
>> handles in curl multi at the same time), do I prevent internal
>> optimization within libcurl? How can libcurl multiplex efficiently if
>> I don't tell it in advance the list of chunks I want to download?
> It will multiplex equally well. Each new transfer you ask for will
> join an existing connection - if possible - at the time it starts.
> There's really no difference to curl, that decision is made when the
> transfer starts anyway. The main difference between your two solutions
> is that in the first case you hand over a lot of the transfer queueing
> to curl, while you do it yourself in the second case.
> Without having all the factors and knowledge of the solution that you
> do, I would say that the second solution sounds more flexible for you.
> That way gives you more room for your application to act depending on
> circumstances during the transfer.

Thanks a lot for the detailed answer! I see things a bit more clearly
now. There's still one point I'd like to discuss if you don't mind :)

One thing that still bugged me was that libcurl provides
`CURLMOPT_MAX_TOTAL_CONNECTIONS` to cap the number of parallel
connections (in case parallel connections are used), but no equivalent
to limit the number of parallel streams when multiplexing. However,
after more reading, it seems that the maximum number of concurrent
streams is negotiated between the server and the client, which means I
don't have to care about it.

However, it also means that I don't know the maximum number of parallel
downloads allowed; it depends on the server.

If my understanding is correct, it seems that I could underfeed libcurl
with my "second implementation". I.e. if I give curl only 8 handles at
a time while the server would be OK with 16 multiplexed downloads, then
I'm doing poorly...

So it seems I'd be better off feeding libcurl more handles (64 or so),
to ensure the curl multi always has enough handles in its internal
pool. Of course, I'd set CURLMOPT_MAX_TOTAL_CONNECTIONS to something
smaller, because if multiplexing can't be negotiated, I don't want 64
parallel connections; that's too much.

Does that.... make sense?

Received on 2019-03-21