curl-users

RE: doing multiple file download

From: Daniel Stenberg <daniel_at_haxx.se>
Date: Tue, 19 Dec 2000 08:48:08 +0100 (MET)

On Mon, 18 Dec 2000, Stenberg Björn wrote:

> Would it be messy to also support this model:
>
> curl -o file1 http://url1 -o file2 http://url2
>
> That way, the command line is more easily built from shell scripts and such.

Simple answers such as this remind me why it is a good thing to always post
questions and thoughts. I don't know why, but I didn't think of this very
simple and elegant solution.

Of course, the -o options and the URLs don't necessarily come next to each
other, which might get messy if people specify, say, four URLs but only
three -o options. But that's not a big problem.

I wonder if we should encourage this syntax instead:

        curl -o file1 --url http://url1 -o file2 --url http://url2

> Another suggestion I've heard is the opposite: one-file-many-connections,
> the idea being to circumvent per-connection bandwidth caps.

Yes, but I'd imagine that can be done with the current implementation of the
library. It is more of a matter of using the interface to request ranges.

> > To the average user, the difference between using separate
> > connections or a single one can't be noticeable.
>
> Well, some ftp servers do take a horribly long time to log in to. Also,
> some servers are frequently full, meaning if you disconnect between files
> you risk being denied entrance.

True, ftp is a bit different from http in that respect, of course.

> > Anyone want to participate on this and find their way to fame
> > and glory? ;-)
>
> I'm in. Just point me in the right direction. ;)

Neato!

The file to start digging in would be: curl/src/main.c

URLs can be specified in two ways: with no option at all, or with the --url
option. The parser should instead build a linked list of URLs, and once all
options are parsed it should loop through all the URLs on that list and
download them. Each individually specified URL may itself contain URL
globbing, so each one can in turn loop a few times.

The -o file names should also go on a linked list, and one should be
extracted for every URL that is used.

If a different number of URLs and output file names is given, an error
should probably be reported before any download takes place.

Care should be taken that -T (upload) still works; possibly it could be
extended at the same time so that multiple files can be uploaded.

-- 
  Daniel Stenberg -- curl project maintainer -- http://curl.haxx.se/
Received on 2000-12-19