
curl-users

FTP multi-file get?

From: Roth, Kevin P. <KPRoth_at_MarathonOil.com>
Date: Fri, 9 Nov 2001 10:00:07 -0500

I know this topic has come up before; however, I've recently seen an
instance where I would have liked to use curl's command line (without
having to write any extra scripts) to do this.

I know cURL wasn't intended to be a replacement for wget; in particular,
it wasn't intended to operate on more than one file at a time without
scripting around it. However, it does support some simple multi-file
operations using the [] and {} constructs, or by specifying multiple
URLs.
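For reference, those existing constructs look roughly like this (a minimal sketch; the host, directory, and file names are hypothetical):

```shell
# Ranges with []: fetches file1.txt through file5.txt in one invocation.
curl -O "ftp://ftp.example.com/dir/file[1-5].txt"

# Alternatives with {}: fetches readme.txt and changelog.txt.
curl -O "ftp://ftp.example.com/dir/{readme,changelog}.txt"

# Multiple explicit URLs; note -O is given once per URL.
curl -O "ftp://ftp.example.com/dir/a.txt" \
     -O "ftp://ftp.example.com/dir/b.txt"
```

The point of the feature request is that all of these require knowing the file names in advance, which a server-side wildcard would not.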

Since FTP can *usually* get directory listings, I'm wondering if it
would be difficult to add the ability to download all files in ONE ftp
directory, e.g.:
  `curl ftp://ftp.mycompany.com/directory/* -O`
or perhaps even "*.txt", etc...

I wouldn't expect this to handle recursion (into subdirectories),
intelligent file-renaming (e.g. if a downloaded file already existed),
or even time-sensitive downloads. I know wget already does all this. But
for someone who's already familiar with cURL's command-line options but
isn't familiar with wget's (i.e. me), the array of options available in
wget is quite confusing.

Any chance this could be considered for addition to the ToDo list? I'm
willing to try tackling the coding for it, if I can find the time;
however, it would be easier if some experienced C person could take it
on instead...

Thanks,
--Kevin
Received on 2001-11-09