curl-users

Re: FTP multi-file get?

From: Daniel Stenberg <daniel_at_haxx.se>
Date: Sun, 11 Nov 2001 20:33:38 +0100 (MET)

On Fri, 9 Nov 2001, Roth, Kevin P. wrote:

> I know this topic has come up before, however I've recently seen an
> instance where I would have liked to use curl's command line (without
> having to write any extra scripts) to do this.
>
> I know cURL wasn't intended to be a replacement for wget; in particular,
> it wasn't intended to operate on more than one file at a time without
> scripting around it. However, it does support some simple
> multi-file operations using [] and {} constructs, or by specifying
> multiple URLs.

Yes, due to popular demand. Everything is a matter of demand and who's
willing to make it come true.
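
With the globbing that is already there, something like `curl -O "ftp://ftp.example.com/file[1-10].txt"` (a made-up host name, just to illustrate) fetches ten files in one invocation, and the {} construct does the same for a list of named alternatives.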

> Since FTP can *usually* get directory listings, I'm wondering if it
> would be difficult to add the ability to download all files in ONE ftp
> directory, e.g.:
> `curl ftp://ftp.mycompany.com/directory/* -O`
> or perhaps even "*.txt", etc...
>
> I wouldn't expect this to handle recursion (into subdirectories), or
> intelligent file-renaming (e.g. if a downloaded file already existed), or
> even time-sensitive downloads.

It would indeed be possible, as libcurl already offers a pretty easy
interface for getting the data. The trickiest part, and also what causes
problems for software such as wget, is figuring out which files to download.
This is because the LIST and NLST FTP commands don't return file lists in any
well-defined format, so directories and files can't be told apart reliably.
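
Roughly, the two steps could be glued together with libcurl along these
lines. This is only a sketch of the approach, using a made-up host name and
CURLOPT_DIRLISTONLY, the option in today's libcurl that asks for a name-only
(NLST) listing (it was called CURLOPT_FTPLISTONLY in older versions):

  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>
  #include <curl/curl.h>

  /* Append NLST output (one name per line) to a growing string. */
  static size_t collect(char *data, size_t size, size_t nmemb, void *userp)
  {
    char **buf = (char **)userp;
    size_t bytes = size * nmemb;
    size_t old = *buf ? strlen(*buf) : 0;
    char *tmp = realloc(*buf, old + bytes + 1);
    if(!tmp)
      return 0; /* abort the transfer on out-of-memory */
    memcpy(tmp + old, data, bytes);
    tmp[old + bytes] = '\0';
    *buf = tmp;
    return bytes;
  }

  int main(void)
  {
    const char *dir = "ftp://ftp.example.com/directory/"; /* made-up URL */
    char *listing = NULL;
    char *name;
    CURL *curl;

    curl_global_init(CURL_GLOBAL_DEFAULT);
    curl = curl_easy_init();
    if(!curl)
      return 1;

    /* Step 1: get a name-only (NLST) listing of the directory. */
    curl_easy_setopt(curl, CURLOPT_URL, dir);
    curl_easy_setopt(curl, CURLOPT_DIRLISTONLY, 1L);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, collect);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &listing);
    if(curl_easy_perform(curl) != CURLE_OK || !listing)
      return 1;

    /* Step 2: fetch each listed name. A name that is really a
       subdirectory just makes that one transfer fail, which is the
       file-or-directory problem described above. */
    curl_easy_setopt(curl, CURLOPT_DIRLISTONLY, 0L);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, NULL); /* default: fwrite */
    for(name = strtok(listing, "\r\n"); name; name = strtok(NULL, "\r\n")) {
      char url[1024];
      FILE *out = fopen(name, "wb");
      if(!out)
        continue;
      snprintf(url, sizeof(url), "%s%s", dir, name);
      curl_easy_setopt(curl, CURLOPT_URL, url);
      curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
      curl_easy_perform(curl);
      fclose(out);
    }

    free(listing);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
  }

The same easy handle is reused for every transfer, and a listed name that
really is a subdirectory simply makes that one download fail, which is
exactly the file-versus-directory guessing problem mentioned above.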

> Any chance this could be considered for addition to the ToDo list?

I would not object if anyone added this feature, so yes, I can certainly put
it in the TODO file.

> I'm willing to try tackling the coding for it, if I can find the time;
> however, it would be easier if some experienced C person could take it on
> instead...

I don't think I'll be able to do this myself in the near future. I'll of
course assist anyone who's willing to contribute!

-- 
    Daniel Stenberg -- curl groks URLs -- http://curl.haxx.se/