curl-library
Re: On the right way to use HTTP/2 multiplex
Date: Thu, 21 Mar 2019 15:02:57 +0100 (CET)
On Thu, 21 Mar 2019, Arnaud Rebillout wrote:
> Thanks a lot for the detailed answer! I see things a bit more clearly
> now.
One of these days I'll figure out a place where this stuff should be
documented and write something about it there...
> One thing that still bugged me was that libcurl provides
> `CURLMOPT_MAX_TOTAL_CONNECTIONS` to give a maximum number of parallel
> connections (in case parallel connections are used), but no equivalent to
> limit the number of parallel streams in case of multiplex.
Just about everything in libcurl is primarily driven by what we as a community
have wanted from it. Lots of people have asked for ways to limit how many
concurrent connections it uses. Not a single user has so far asked for a way to
limit the number of streams. Hence we have no support for that.
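For completeness, the connection side of it is just a multi option. Something
like this (untested sketch, the 8 is an arbitrary number):

  CURLM *multi = curl_multi_init();

  /* never use more than 8 connections in total, no matter how many
     easy handles have been added to this multi handle */
  curl_multi_setopt(multi, CURLMOPT_MAX_TOTAL_CONNECTIONS, 8L);

  /* there is no corresponding CURLMOPT_* to cap the number of
     multiplexed streams per connection */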
> However after more reading, it seems that this is negotiated between the
> server and the client, which means that I don't have to care about that.
In HTTP/2 the max number of parallel streams is negotiated, yes. It is almost
always 100 or 128 but occasionally something different.
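On the libcurl side you only opt in to multiplexing; the per-connection stream
cap is whatever the server announces in its SETTINGS frame. Roughly like this
(untested sketch, the URL is a placeholder; recent libcurl versions enable
multiplexing by default):

  CURLM *multi = curl_multi_init();

  /* allow transfers to be multiplexed over the same connection */
  curl_multi_setopt(multi, CURLMOPT_PIPELINING, CURLPIPE_MULTIPLEX);

  CURL *easy = curl_easy_init();
  curl_easy_setopt(easy, CURLOPT_URL, "https://example.com/file");
  /* prefer HTTP/2 over TLS when the server supports it */
  curl_easy_setopt(easy, CURLOPT_HTTP_VERSION, (long)CURL_HTTP_VERSION_2TLS);
  /* wait for an existing connection to confirm multiplexing instead of
     opening a new connection right away */
  curl_easy_setopt(easy, CURLOPT_PIPEWAIT, 1L);

  curl_multi_add_handle(multi, easy);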
> However it also means that I don't know what's the maximum number of
> parallel downloads allowed, it depends on the server.
> If my understanding is correct, it seems to me that I could underfeed
> libcurl with my "second implementation". Ie, if I give curl only 8 handles
> at a time, while the server would be OK to have 16 downloads multiplexed,
> then I'm doing poorly...
Well, perhaps. You then assume that you need many streams to be effective,
while a server typically can saturate a pipe fine with a single stream if it
just has the data. There's nothing that says you get a higher total bandwidth
with 16 streams than with 8 or 4.
Of course you might have some insights into the server end of things for this
use case and can make an educated guess, but then maybe you can also make
an educated guess on how many parallel transfers to keep?
> So it seems that I'd be better feeding libcurl with more handles (64 or so),
> to ensure the curl multi always has enough handles in its internal pool. Of
> course, I'd make sure to set CURLMOPT_MAX_TOTAL_CONNECTIONS to something
> smaller, because in case multiplex can't be negotiated, I don't want 64
> parallel connections, it's too much.
>
> Does that.... make sense?
If you think you might need up to 64 parallel transfers to get the maximum
throughput, then it makes perfect sense to me, yes.
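In code, that plan would look roughly like this (untested sketch; 64 and 8 are
the numbers from your mail and the URLs are made up):

  #include <stdio.h>
  #include <curl/curl.h>

  #define NHANDLES 64

  int main(void)
  {
    curl_global_init(CURL_GLOBAL_DEFAULT);

    CURLM *multi = curl_multi_init();
    curl_multi_setopt(multi, CURLMOPT_PIPELINING, CURLPIPE_MULTIPLEX);
    /* if multiplexing cannot be negotiated, do not fall back to 64
       separate connections */
    curl_multi_setopt(multi, CURLMOPT_MAX_TOTAL_CONNECTIONS, 8L);

    for(int i = 0; i < NHANDLES; i++) {
      CURL *easy = curl_easy_init();
      char url[256];
      snprintf(url, sizeof(url), "https://example.com/file-%d", i);
      curl_easy_setopt(easy, CURLOPT_URL, url);
      curl_easy_setopt(easy, CURLOPT_HTTP_VERSION, (long)CURL_HTTP_VERSION_2TLS);
      curl_easy_setopt(easy, CURLOPT_PIPEWAIT, 1L);
      curl_multi_add_handle(multi, easy);
    }

    int still_running = 0;
    do {
      curl_multi_perform(multi, &still_running);
      curl_multi_wait(multi, NULL, 0, 1000, NULL);
    } while(still_running);

    /* a real program would curl_multi_info_read() the results and
       remove/cleanup each easy handle here */
    curl_multi_cleanup(multi);
    curl_global_cleanup();
    return 0;
  }

The multi handle then runs as many of the 64 transfers concurrently as the
negotiated stream limit and the connection cap allow.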
-- / daniel.haxx.se