curl-users
Re: Backing up a site with a huge number of subdomains?
Date: Mon, 20 Dec 2004 04:29:09 -0500
Hi,
I have never had to back up across subdomains. I usually back up based on the
directory structure or based on the site (i.e. download everything
within a single domain) and I tell curl to delimit based on that.
Subdomains have me stumped :(.
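For what it's worth, the single-domain case can be done with curl's URL
globbing; here's a minimal sketch (host names, paths and the subdomain list
are just placeholders, and the list has to be known up front since curl does
not crawl links on its own):

  # Fetch a numbered range of pages from one host; #1 in the -o
  # argument expands to the current value of the [1-50] glob.
  curl -o "page_#1.html" "http://www.example.com/archive/page_[1-50].html"

  # The same globbing syntax can cover a fixed list of subdomains:
  curl -o "#1_index.html" "http://{sub1,sub2,sub3}.example.com/index.html"

That only helps when you can enumerate the subdomains yourself, which is
probably the crux of the original question.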
On Mon, 20 Dec 2004 10:24:03 +0100 (CET), Daniel Stenberg
<daniel-curl_at_haxx.se> wrote:
> On Mon, 20 Dec 2004, BigHype wrote:
>
> > I have been using curl for quite some time but I've never encountered this
> > problem before. I usually back up sites that I like because they tend to
> > disappear quite often, and I use curl to do that. I came across an
> > interesting site that has all of its content spread across many subdomains.
> > The subdomains are of the format sub1.domain.com and I am looking for some
> > suggestions on how to back it up.
>
> So how did you do it before, and why does this cause you any extra trouble?
> Please give us some details.
>
> --
> Daniel Stenberg -- http://curl.haxx.se -- http://daniel.haxx.se
> Dedicated custom curl help for hire: http://haxx.se/curl.html
>
Received on 2004-12-20