curl-users
Re: http stop before full page downloaded
Date: Sun, 13 Feb 2011 09:07:45 -0800
On Fri, Feb 11, 2011 at 5:57 PM, Dan Fandrich <dan_at_coneharvesters.com> wrote:
> On Fri, Feb 11, 2011 at 03:47:53PM -0800, skip wrote:
>> The only option that seems to work for me is --max-time. I choke it
>> off after a few seconds and then parse the file differently. Is this
>> safe to use? Are there any other dependent options that affect
>> --max-time adversely?
>
> There shouldn't be in this situation.
>
>> The --range option does not work on this server. Don't know what to do
>> to fix it.
>
> It's likely that the URL is dynamically generated, in which case most
> servers simply won't support ranges.
>
>> The pipe to head hangs with no output after 10 lines. The curl process
>> does not exit.
>
> curl won't exit until head does. I don't know what sort of output is being
> sent from the server, but if it's not terminated with CR/LF, then head
> won't consider that a line has been received. And if you're using Windows,
> then this probably isn't going to work, anyway.
>
>>>> Dan
curl -sL -m 4 --no-buffer --dump-header - --range 0-100 'http://www.host.com'
I've settled on the above line (--dump-header needs an argument; "-"
sends the headers to stdout) as it gives enough for my tests, though
it takes 4 seconds because I added a couple of seconds to account for
network latency. The --range option is redundant for me since the
servers don't recognize it. All the text is dynamically generated on
the HTTP server. Each line is terminated by a \n. Don't know about
Windows, but it works on Mac and Ubuntu.
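For anyone trying the pipe-to-head approach Dan described, a minimal
sketch of the behavior (the line counts and URL here are illustrative,
not from my setup): head only counts a line once it sees the newline
terminator, and curl keeps running until head exits and closes the pipe
(curl's next write then fails with SIGPIPE).

```shell
# head exits after printing 10 newline-terminated lines; the writer's
# next write then gets SIGPIPE, so the whole pipeline ends promptly.
seq 20 | head -n 10

# Same idea with curl (illustrative URL): curl exits only after head
# has received 10 complete \n-terminated lines and closed its end.
# curl -s 'http://www.host.com' | head -n 10
```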
Thanks for the help.
-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-users
FAQ: http://curl.haxx.se/docs/faq.html
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2011-02-13