curl-users
RE: [progress report] multiple URLs
Date: Mon, 8 Jan 2001 16:34:59 +0100
Wow, I like that.
The only other thing that comes to mind is support for multiple
*simultaneous* downloads. This, of course, leads us off to the topic of
multithreading and the related portability issues.
Do you feel really adventurous?-)
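
(In the meantime, the closest approximation is probably to start one curl
process per URL from the shell, something like

% curl -o url1 haxx.se &
% curl -o url2 contactor.se &
% wait

which runs the transfers in parallel processes. A workaround, though, not a
substitute for doing it inside the tool.)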
Jörn
> -----Original Message-----
> From: Daniel Stenberg [mailto:daniel@haxx.se]
> Sent: Friday, January 05, 2001 4:24 PM
> To: Curl Mailinglist
> Subject: [progress report] multiple URLs
>
>
> Hi curlers
>
> I've spent some time after the recent release to fiddle with
> the new multiple
> URL support (not to mention I've renamed what felt like a
> million internal
> symbols).
>
> Currently, I can fetch multiple URLs with a single simple command line, like
> the following:
>
> (two pages to stdout)
> % curl haxx.se contactor.se
>
> (the same pages stored in two files, using different option positions)
> % curl haxx.se contactor.se -o url1 -o url2
> % curl -o url1 -o url2 haxx.se contactor.se
> % curl -o url1 haxx.se contactor.se -o url2
>
> Note that there's always one output option for each URL; a missing output
> option means stdout, and using '-o -' also sends the output to stdout.
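
If I read the pairing right (my inference, Daniel didn't spell it out), the
output options are matched to the URLs in the order given, so with one -o and
two URLs the leftover URL goes to stdout:

% curl -o url1 haxx.se contactor.se

(haxx.se saved in 'url1', contactor.se written to stdout)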
>
> Similarly, --url is also supported to specify URLs, like in:
>
> % curl --url haxx.se -o url1 --url curl.haxx.se -o curldump
>
> (which makes more sense if used in a config file)
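
For the config file case I imagine something like this, assuming the usual
-K/--config syntax where the long options simply lose their leading dashes
(file name made up):

# multi.config
url = "haxx.se"
output = "url1"
url = "curl.haxx.se"
output = "curldump"

% curl -K multi.config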
>
> I can also save each URL under its remote file name (-O), like:
>
> % curl -O www.site.com/file.html -O www.anothersite.com/file2.html
>
> Or we can do globbing individually, on a per-URL basis:
>
> % curl -O www.site.com/file.html -o dump#1.dump www.site.com/file[1-9].txt
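
If the #1 picks up the current glob value, like the [1-9] and {a,b} ranges we
know from single-URL globbing, then I'd expect

% curl -o dump_#1.txt 'www.site.com/file{one,two}.txt'

to store dump_one.txt and dump_two.txt (my guess, untested).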
> ... is there anything more I should consider? It felt a little bit too easy!
> ;*)
>
> -- Daniel Stenberg -- curl project maintainer -- http://curl.haxx.se/