Re: Download entire website

From: Ralph Mitchell <rmitchell_at_eds.com>
Date: Fri, 09 May 2003 00:07:23 -0500

Wget is also capable of mirroring a web site. You could use curl, but you'd
have to sift through every page you download, extract every IMG/SCRIPT SRC
and every HREF, and then pull those pages as well... Wget already does
that, including recreating the directory structure of the original site and
fixing up the links on the copied pages so that it all works like the
original.
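[For reference, a minimal sketch of both approaches, not from the original mail: wget's built-in mirroring flags, and the sort of link extraction you would have to script yourself around curl. The example.com URL is a placeholder.]

```shell
# wget's built-in mirroring: recursive fetch, page requisites (images, CSS,
# scripts), and link rewriting so the local copy browses offline.
wget --mirror --page-requisites --convert-links --no-parent http://example.com/

# The curl route means scraping link-bearing attributes out of each page
# yourself, for instance:
curl -s http://example.com/ | grep -oiE '(href|src)="[^"]+"'
# ...then fetching each extracted URL recursively and rewriting the links
# by hand -- exactly the work wget already does for you.
```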

Ralph Mitchell

"HART, CHRISTOPHER T" wrote:

> If you're looking to essentially mirror a site (multiple pages, inline
> images, etc.), you may want to take a look at a tool called pavuk. Curl is
> an awesome tool for automation of web-based processing, filling in forms,
> downloading files, etc., but pavuk is specifically designed for mirroring a
> site.
>
> Chris
>
> -----Original Message-----
> From: Reuben Pearse [mailto:Reuben.Pearse_at_presence-systems.com]
> Sent: Thursday, May 08, 2003 10:41 AM
> To: curl-users_at_lists.sourceforge.net
> Subject: Download entire website
>
> Hi there,
>
> Can I use curl to download all the content of a website so that it's then
> viewable offline?
>
> Reuben
> reuben_at_pearse.co.uk

Received on 2003-05-09