Re: Connection re-use w.r.t HTTP/1.1 and HTTP/2.0
Date: Tue, 24 Dec 2019 12:19:44 +0530
On Thu, Dec 19, 2019 at 1:45 AM Kunal Ekawde <kunalekawde_at_gmail.com> wrote:
>
> On Wed, Dec 18, 2019 at 6:36 PM Daniel Stenberg <daniel_at_haxx.se> wrote:
> >
> > On Wed, 18 Dec 2019, Kunal Ekawde wrote:
> >
> > > Ok, here is the example I've verified above behavior with local HTTP/2
> > > server, this is based on http2-download.c example
> >
> > Thank you, that certainly pin-pointed your observation nicely!
> >
> > The bug is here:
> >
> > https://github.com/curl/curl/blob/14f8b6e69e97e60f43c3188d2e22c10f05554a10/lib/url.c#L3602-L3610
> >
> > ... I'm pretty sure it comes from old Pipelining logic that is just wrong now.
> >
> > Here's my PR to fix it:
> >
> > https://github.com/curl/curl/pull/4732
> >
>
> Sure, thanks a lot for addressing this, I shall also verify with my other tests.
>
Sorry for the delay in verifying this. I tried the same example with
multiple calls, e.g. 100, 200, ... 1000, and it is reusing the same
connection. However, I noticed that with this example it could initiate
at most 1020 transfers; beyond that it wouldn't initiate any more. I'm
not sure whether that comes from the total max connection limit of 1024
specified or from my local server.
The concern here is that it doesn't seem to apply any upper limit. Do we
already have logic to enforce some upper limit (like a maximum number of
concurrent connections)?
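To illustrate the kind of cap I mean, here is a minimal sketch using the
multi interface (not the http2-download.c example itself); the specific
limit values are just placeholders I picked for illustration:

    /* minimal sketch: per-multi caps on connection and cache usage.
       The numeric limits below are arbitrary placeholders. */
    #include <curl/curl.h>

    int main(void)
    {
      CURLM *multi;

      curl_global_init(CURL_GLOBAL_DEFAULT);
      multi = curl_multi_init();

      /* allow multiplexing so transfers can share one HTTP/2 connection */
      curl_multi_setopt(multi, CURLMOPT_PIPELINING, CURLPIPE_MULTIPLEX);

      /* hard cap on simultaneously open connections for this multi handle */
      curl_multi_setopt(multi, CURLMOPT_MAX_TOTAL_CONNECTIONS, 8L);

      /* cap on simultaneous connections to a single host */
      curl_multi_setopt(multi, CURLMOPT_MAX_HOST_CONNECTIONS, 2L);

      /* size of the connection cache kept around for reuse */
      curl_multi_setopt(multi, CURLMOPT_MAXCONNECTS, 10L);

      /* ... add easy handles (optionally with CURLOPT_PIPEWAIT set to 1L so
         they prefer waiting for an existing multiplexed connection) and
         drive them with curl_multi_perform()/curl_multi_poll() ... */

      curl_multi_cleanup(multi);
      curl_global_cleanup();
      return 0;
    }

Something along those lines is what I had in mind when asking whether an
upper limit is enforced by default or only when set explicitly.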
> > --
> >
> > / daniel.haxx.se | Get the best commercial curl support there is - from me
> > | Private help, bug fixes, support, ports, new features
> > | https://www.wolfssl.com/contact/
>
>
>
> --
> ~Kunal
--
~Kunal