
curl-users

Re: Downloading all files in a directory

From: Walt Pawley <walt_at_wump.org>
Date: Mon, 20 Sep 2004 12:37:57 -0700

On 9/16/04 8:48 AM +0100, Gavin Webb wrote on "Downloading all files in a
directory":

>I am new to cURL and would like to use the command line tool to download all
>files from a directory at an FTP site and similarly upload all files in a
>local directory to an FTP site. The FTP transfers will be secured using SSL.
>
>I have tested and got some basic commands working for uploading and
>downloading a single file and getting a directory listing.
>
>I can write a script to identify all the file names from the directory
>listing, but this is a bit of a pain. Is there a command-line
>switch/option to tell cURL to download/upload all files?

As Daniel pointed out, there's no "switch" to flip. But if you are going to
work at the command line, I would suggest getting to know things like sed
or, better IMHO, Perl and shell scripting to deal with these issues. Spend
a little time building tools for your work and they can provide as many
"switches" as you need.

For example, assuming that all the file names in the FTP directory you want
to get are amenable (no spaces, etc.), you could use something like ...

#!/bin/sh
site=$1
# list the directory (the trailing slash forces a listing), keep only
# plain-file lines, and strip the first eight fields to leave the names
files=`curl -s ftp://${site}/|grep -e '^-'|perl -p -e 's/(.+?\s+){8}//;s/\r//;' -`
# fetch each file into the current directory under its remote name
for file in ${files};do curl -O ftp://${site}/${file};done
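
If you also need to go the other way (push every file in a local directory
up to the server), the same idea works in reverse. Here's an untested
sketch along those lines; it assumes ${site} already names the target
directory, plain file names again, and uses curl's -T (upload) option, with
the trailing slash on the URL telling curl to keep each local name. Add
curl's FTP-over-SSL option (--ftp-ssl in recent versions) if the server
requires it:

#!/bin/sh
site=$1
# upload every regular file in the current directory to ftp://${site}/
for file in *;do [ -f "${file}" ] && curl -T "${file}" ftp://${site}/;done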

-- 
perl -e 'foreach(qw/13 10 60 47 98 117 115 104 62 13 10 10/){print chr($_)};'
Walter M. Pawley <walt_at_wump.org>
Wump Research & Company
676 River Bend Road, Roseburg, OR 97470
         541-672-8975