curl-users

Re: http stop before full page downloaded

From: skip <scp0801_at_gmail.com>
Date: Fri, 11 Feb 2011 15:47:53 -0800

The only option that seems to work for me is --max-time. I choke it
off after a few seconds and then parse the file differently. Is this
safe to use? Are there any other dependent options that affect
--max-time adversely?
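A minimal sketch of that approach (the URL is illustrative, not the
actual page): curl transfers for up to the given number of seconds,
then stops and exits with code 28 (operation timed out); whatever
arrived by then is kept in the output file.

```shell
# Snapshot a continuously-updating page: stop after 5 seconds and
# keep the partial download. Exit code 28 signals the timeout fired.
curl -s --max-time 5 -o snapshot.html 'http://example.com/stream'
echo "exit code: $?"
```

Checking for exit code 28 distinguishes "timed out as planned" from a
genuine failure such as a refused connection.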

The --range option does not work on this server. I don't know what to
do to fix it.
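One way to see whether the server supports byte ranges at all (URL
illustrative): servers that honor ranges usually advertise
Accept-Ranges and answer a ranged request with 206 Partial Content;
servers that ignore the request just return 200 with the full body.

```shell
# Look for an Accept-Ranges header in the response headers.
curl -sI 'http://curl.haxx.se/docs/httpscripting.html' | grep -i 'accept-ranges'

# Ask for the first 101 bytes and print only the status code:
# 206 means the range was honored, 200 means it was ignored.
curl -s -o /dev/null -w '%{http_code}\n' --range 0-100 \
  'http://curl.haxx.se/docs/httpscripting.html'
```

If the server answers 200 here, no curl option will make --range work;
the truncation has to happen on the client side.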

Piping to head produces 10 lines and then hangs with no further
output; the curl process does not exit.
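That hang is probably the usual pipe mechanics: head exits after 10
lines, but the producer is only killed by SIGPIPE on its *next* write
to the closed pipe. curl buffers stdout when it is not a terminal, so
on a slowly-trickling stream that triggering write may never come.
curl's -N / --no-buffer makes it write each chunk as it arrives. A
sketch of the mechanism and the workaround (URL illustrative):

```shell
# The mechanism: head exits after N lines, and the producer dies on
# its next write via SIGPIPE. 'yes' writes constantly, so this
# pipeline finishes immediately instead of hanging.
yes line | head -n 3

# Workaround for curl: disable output buffering so the SIGPIPE-
# triggering write happens promptly after head exits.
# curl -sN 'http://example.com/stream' | head -n 10
```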

On Fri, Feb 11, 2011 at 3:24 PM, Christopher Stone
<listmeister_at_thestoneforge.com> wrote:
> On Feb 11, 2011, at 14:29, skip wrote:
>
> Is there a command line option to stop the file transfer before the entire
> file is transferred from an http source?
>
> My need is for getting a snapshot of an http page that is constantly being
> updated/appended through tail -f. I just want to get a few lines of the
> current http output and stop. Right now curl runs forever.
>
> ______________________________________________________________________
> Hey skip,
> As Dan said this is not guaranteed to work with all pages and servers:
> curl -sL --user-agent 'Opera/9.70 (Linux ppc64 ; U; en) Presto/2.2.1'
> --range 0-100 --dump-header - 'http://curl.haxx.se/docs/httpscripting.html'
> --
> Best Regards,
> Chris
> -------------------------------------------------------------------
> List admin: http://cool.haxx.se/list/listinfo/curl-users
> FAQ:        http://curl.haxx.se/docs/faq.html
> Etiquette:  http://curl.haxx.se/mail/etiquette.html
>
>
Received on 2011-02-12