
curl-users

Re: Get complete page?

From: Doug McNutt <douglist_at_macnauchtan.com>
Date: Wed, 1 Mar 2006 08:02:29 -0700

At 04:14 -0600 3/1/06, Ralph Mitchell wrote:
>Curl just gets one file at a time. You'd need to parse the initial
>page, extract any links to images, javascript files, style sheets,
>etc, go fetch those things too, then fix up the links in the page to
>point to the local files.
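
The steps Ralph describes can be sketched in shell: save the page with curl, pull out the `src`/`href` links with a rough regex, and fetch each one into a matching local directory. Everything here (the sample page, `example.com`, the regex) is a hypothetical illustration, not part of the original mail; the regex is only adequate for simple, well-formed pages.

```shell
# Stand-in for the page curl would have fetched (hypothetical content).
cat > page.html <<'EOF'
<html><head><link rel="stylesheet" href="style.css"></head>
<body><img src="img/logo.png"><script src="js/app.js"></script></body></html>
EOF

base=http://example.com
# Extract src=/href= values (crude regex; fine only for simple pages).
grep -oE '(src|href)="[^"]+"' page.html | sed -E 's/.*"([^"]+)"/\1/' |
while read -r link; do
  mkdir -p "$(dirname "$link")"           # create subdirectories on the fly
  echo curl -sS -o "$link" "$base/$link"  # drop 'echo' to actually fetch
done
```

After fetching, the links inside page.html would still need rewriting to the local copies, which is the "fix up the links" step Ralph mentions.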

For a one-time download like that, I recently used the log file produced by the Live HTTP Headers extension to Firefox.

Using Perl, I read the log and replayed the logged GETs and POSTs with curl. I had to create new subdirectories on the fly, and there were a few, but not many, fully qualified URLs that had to be converted to local versions using text substitutions with regular expressions.
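
The replay step might look something like this in shell. The log format, host name, and file names are assumptions for illustration (the actual Live HTTP Headers log is more verbose, and the author's Perl script is not reproduced here):

```shell
# Hypothetical simplified request log (one "METHOD path" per line).
cat > headers.log <<'EOF'
GET /index.html
GET /img/banner.gif
POST /cgi-bin/search
EOF

host=http://example.com
while read -r method path; do
  local=${path#/}                   # strip leading slash for the local copy
  mkdir -p "$(dirname "$local")"    # new subdirectories on the fly
  case $method in
    GET)  echo curl -sS -o "$local" "$host$path" ;;
    # POST bodies would come from the log too; @postdata is a placeholder.
    POST) echo curl -sS -o "$local" --data @postdata "$host$path" ;;
  esac
done < headers.log
```

The leading `echo` makes this a dry run; removing it would actually issue the requests. The final text-substitution pass (rewriting fully qualified URLs to local paths) could then be a `sed` over the saved files.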

Be sure to empty Firefox's cache before creating the log file.

Ask if you want the Perl script I used. It was anything but a general application; it ran on Mac OS neXt.

-- 
--> As a citizen of the USA if you see a federal outlay expressed in $billion then multiply it by 4 to estimate your share in dollars. <--