cURL / Mailing Lists / curl-library / Single Mail


Re: Pipelining a single download

From: Alexandre BOUIN <>
Date: Tue, 3 May 2016 17:31:29 +0200

Hello Daniel and Alex,

First, thank you both for your quick replies. I really appreciate it!

> My main question is: why?

I was expecting this question. Unfortunately, I cannot answer it. Just
consider it a research project.

HTTP/2 might be a solution, but I would like to exploit HTTP/1.1 as far as
possible.

> Or is there a chance that you're actually talking about doing N transfers
> in parallel using N connections?

I'm just downloading a single file from a single server using one connection.

In a perfect world, with 2 pipelines, range requests should proceed just like
a single download.
But in reality, with some latency (from 10 ms to 300 ms), that is not the case.
Maybe this is where the magic and evil lie ;)

We performed a first test using Python. Once half of the request (only 20 kB)
has been downloaded, we start a new request.
This is a bit like using 2 pipelines.
This way, all the available bandwidth is consumed.
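For what it's worth, the range arithmetic behind that test can be sketched as follows (the function name and the sizes here are hypothetical; this only computes the Range header values, not the actual transfers):

```python
def split_ranges(total_size, parts=2):
    """Split a resource of total_size bytes into `parts` contiguous
    HTTP Range header values, e.g. "bytes=0-20479"."""
    chunk = total_size // parts
    ranges = []
    start = 0
    for i in range(parts):
        # The last chunk absorbs any remainder from the integer division.
        end = total_size - 1 if i == parts - 1 else start + chunk - 1
        ranges.append(f"bytes={start}-{end}")
        start = end + 1
    return ranges

# A 40 kB resource split in two: the second request is issued once
# roughly the first half has arrived.
print(split_ranges(40960))  # → ['bytes=0-20479', 'bytes=20480-40959']
```

In the real test, the second request for the trailing range is started as soon as the first half has been received, so the connection never goes idle between the two responses.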

Doing the same using libcurl, with 2 pipelines, doesn't give the same result.

Do you have any idea about this?

Thanks, Alex B.

2016-05-03 15:56 GMT+02:00 Alex Bligh <>:

> On 3 May 2016, at 14:46, Daniel Stenberg <> wrote:
> > My main question is: why? The only reason to do pipelining is to
> > overcome latency when using HTTP/1.1 - the time from the end of the
> > previous transfer until the next one starts. You're downloading a single
> > resource, so doing pipelining for a single file just adds a lot of
> > complexity with no added benefits.
> Or to put it another way, if you want to go crazy with range headers, you
> will need n separate connections to take advantage of it, not pipelining.
> But in general you would be far better off finding out why a single
> connection does not maximise available bandwidth, and attempting to fix
> that.
> --
> Alex Bligh

Received on 2016-05-03