Re: Parallelism is coming

From: Daniel Stenberg <>
Date: Wed, 17 Apr 2019 12:03:46 +0200 (CEST)

On Wed, 17 Apr 2019, Jeremy Nicoll wrote:

>> What else do we need to consider?
> Suppose with serial use someone wanted to fetch several pages but write the
> results to just one file, one after the other. What would happen with
> parallel fetches all writing to one file?

If you want to save the output from several transfers into a single file, then
doing them in parallel will probably lead you into an alley filled with darkness
and tears! =)

That's an excellent use case for keeping the serial behavior, which also has to
remain the default behavior for this and other reasons.
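
For the record, that serial single-file case already works today: give curl
several URLs on one command line and redirect stdout (the URLs here are just
placeholders):

   curl https://example.com/one https://example.com/two > combined.html

curl fetches the URLs one after the other, so the outputs land in the file in
order, back to back.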

> Likewise how will you separate the headers (if they're being written to a
> separate file from the page content) rather than having several pages'
> headers all jumbled up?

I would guess that just about every use case for doing parallel transfers will
save those transfers into different files.
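
A hedged sketch of what that could look like. The parallel option name below is
made up (nothing is decided yet); mapping each -o output file to its
corresponding URL is existing curl behavior:

   curl --parallel -o one.html https://example.com/one \
                   -o two.html https://example.com/two

With every transfer writing to its own file, the order in which the parallel
transfers complete stops mattering.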


Once curl can do parallel transfers properly, I'm toying with an additional
idea: introduce a master/slave concept for curl where you can have a curl
"master" instance in the background (on the same machine) that does the
transfers for the "client" curl instances, which would then get the benefit of
connection reuse, DNS caching and more from the master that stays alive across
multiple invocations of the curl "client" instance. An imaginary use case could
look like this:

   curl --master &
   curl --client
   curl --client
   curl --client
   curl --kill-master
