cURL / Mailing Lists / curl-users / Single Mail


Re: Limits to curl command or how to download a long list of URL.

From: Rodrigo Zanatta Silva <>
Date: Sun, 4 Jan 2015 14:49:03 -0200

Duh.. Why use an older software version... I installed MacPorts and updated
curl.. Don't be lazy :D

Now it is:

curl 7.39.0 (x86_64-apple-darwin14.0.0) libcurl/7.39.0 OpenSSL/1.0.1j
zlib/1.2.8 libidn/1.29

GNU bash, version 4.3.30(1)-release (x86_64-apple-darwin14.0.0)
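(On the command-length question in the quoted message below: the limit comes from the operating system's ARG_MAX, the total bytes allowed for the arguments plus the environment, not from curl or bash, so upgrading them should not change it. A quick sketch for checking it; the exact value differs per system:)

```shell
# Print the OS limit on total command-line length, in bytes.
# (Value differs between e.g. Linux and OS X.)
getconf ARG_MAX
```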

2015-01-04 14:22 GMT-02:00 Rodrigo Zanatta Silva <>:

> Hi!
> I have a (really) big list of URLs to download. Every file is a small
> HTML page, about 1 KB in size. One easy approach was to download them one
> by one with a big bash script.
> But I discovered it can be faster if I send several URLs at once to curl.
> Maybe because it needs to open a connection, download, and close the
> connection for each file; when I send several at the same time, curl can
> do these steps as fast as possible.
> So one strategy is to use braces. My URLs are not numeric and don't
> follow an easy pattern, so I will create a big command file with
> curl http://site.{one,two,three}.com
> How long can my command be? I am using
> *OSX Yosemite 10.10.1 *
> *Bash: version 3.2.53(1)-release*
> *curl: curl 7.37.1 (x86_64-apple-darwin14.0) libcurl/7.37.1
> SecureTransport zlib/1.2.5*
> (Hmm.. maybe I need to update them; do newer bash and curl change the
> limit?)
> Is there another strategy? Maybe a file with a list. How can I configure
> curl to download all the URLs in a txt file, one URL per line (in an
> efficient way, not by transforming the file into a script that makes curl
> download them one by one)?
> And I will use several threads, so I can start several commands at the
> same time, and each command will use the strategy I am asking about here.

List admin:
Received on 2015-01-04