curl-users

Re: page load times using curl

From: Daniel Stenberg <daniel_at_haxx.se>
Date: Sat, 19 May 2001 15:09:44 +0200 (MET DST)

On Fri, 18 May 2001, Chris Kirby wrote:

(Andres already posted a tool of his that solves this issue, but I wanted to
offer my own view on it.)

> I was wondering if there is a way to use curl to record the number of
> seconds it takes to load the entire page for specific URL, including ALL
> text and images? It must follow all links required to build that page as
> some images are served from different directories.
>
> I can only seem to get curl to read the text portion of a URL.

Curl only gets the URL(s) you explicitly tell it to get. To fetch all the
images and other embedded files as well, you'd have to parse the first HTML
page yourself and then pass each of those URLs to curl.
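
Just to illustrate the idea, a rough, untested shell sketch (not something
that ships with curl) could look like the script below. It assumes the page
uses absolute URLs, that GNU grep and bc are around, and that your curl is
new enough to support -w/--write-out with %{time_total}:

  #!/bin/sh
  # Hypothetical sketch: fetch the HTML with curl, pull the src="..." URLs
  # out with a crude grep (a real HTML parser is safer), then fetch each
  # one with curl and add up the transfer times it reports via
  # -w '%{time_total}'.

  url="$1"

  # time spent on the HTML page itself
  total=$(curl -s -o /dev/null -w '%{time_total}' "$url")

  # every src="..." attribute: images, scripts, frames and so on
  for dep in $(curl -s "$url" | grep -io 'src="[^"]*"' | cut -d'"' -f2); do
    t=$(curl -s -o /dev/null -w '%{time_total}' "$dep")
    total=$(echo "$total + $t" | bc)
  done

  echo "$url plus its inline parts: $total seconds of transfer time"

Relative URLs, tags split across lines and so on would need a smarter
parser, which is exactly why a dedicated tool tends to be less work.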

> If curl isn't the best fit for this, maybe something else exists?

Curl isn't the best tool for this. I'd guess that wget is a better-suited
tool. Or perhaps something like Ionax (http://freshmeat.net/projects/ionax/),
which according to its description "measures the effective size of a Web page
in bytes by adding up the HTML and inline dependencies' individual sizes."
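
For example, assuming a wget recent enough to have -p/--page-requisites (and
using a made-up URL here), timing the download of a page plus everything
needed to display it is a one-liner:

  time wget -q -p http://www.example.com/

The "real" figure that time prints is then the wall-clock total for the HTML
and all of its inline files.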

-- 
     Daniel Stenberg -- curl dude -- http://curl.haxx.se/