
curl-library

RE: Downloading unlimited file. Some clarification please.

From: Fadi Kahhaleh <fkahhaleh_at_hotmail.com>
Date: Tue, 3 Jan 2012 18:16:53 +0000

Hi Daniel,

I thought maybe there was a failsafe to prevent bad usage by an attacker (as I read in the description of one of the libcurl options), but now that you clarify this is not a problem, it is good to know.

The buffer sizes I usually receive (taken from a log file I write to) look like:

10291
3155
9304

but sometimes I get 383, 376, 459, and things go bad once I start to get a lot of those smaller packets (as if the consumer is consuming much faster than curl is providing data for me to put into the buffer). Would you say that getting such small chunks is normal?

However, after sending this yesterday, during one of my attempts to troubleshoot, the problem happened almost 5 seconds after starting the application, which clearly means it is not a large-file issue (or an endless-file issue).

About the closed connection and the wrong callback return value: I totally forgot that I open an initial connection to download some headers and then close it. I exit the write callback by returning 0 once I have received enough bytes to make up a header, and I re-establish the connection later, once I am ready to take on full processing of the file.
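
To make that concrete, the early-exit logic is roughly shaped like the simplified sketch below (the struct name and the byte threshold are made up for illustration; this is not my exact code):

#include <string.h>
#include <curl/curl.h>

/* Made-up threshold: how many bytes I need before the header is "complete". */
#define HEADER_BYTES_NEEDED 512

struct header_buf {
  char data[HEADER_BYTES_NEEDED];
  size_t used;
};

/* CURLOPT_WRITEFUNCTION callback; libcurl hands it at most
   CURL_MAX_WRITE_SIZE (16384 by default) bytes per call. */
static size_t write_cb(char *ptr, size_t size, size_t nmemb, void *userdata)
{
  struct header_buf *hb = userdata;
  size_t total = size * nmemb;
  size_t room = sizeof(hb->data) - hb->used;
  size_t take = total < room ? total : room;

  memcpy(hb->data + hb->used, ptr, take);
  hb->used += take;

  if(hb->used >= HEADER_BYTES_NEEDED)
    return 0;     /* returning anything other than 'total' makes libcurl abort
                     the transfer (CURLE_WRITE_ERROR) and close the connection */

  return total;   /* normal case: consumed everything libcurl gave us */
}

It is installed on the header-only handle with the usual options:

curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
curl_easy_setopt(curl, CURLOPT_WRITEDATA, &hb);

and, as far as I understand, that deliberate return 0 is what produced the "Failed writing body (0 != 16384)" and "Closing connection #0" lines from my first mail.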
I am still investigating the issue; it could be that I am running into starvation caused by bad synchronization or something similar in the handoff between the write callback (the producer) and my consumer, which is conceptually like the much-simplified sketch below (made-up names, not my actual code, and it assumes a single consumer thread):
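
#include <pthread.h>
#include <stddef.h>

/* Illustrative only: a bounded byte ring shared between the libcurl write
   callback (producer) and a single consumer thread. */

#define RING_SIZE (64 * 1024)

struct ring {
  unsigned char data[RING_SIZE];
  size_t head, tail, count;           /* count = bytes currently buffered */
  pthread_mutex_t lock;
  pthread_cond_t not_full, not_empty;
};

/* Producer side, called from the write callback. Blocking here also stalls
   the libcurl transfer until the consumer has drained some space. */
static void ring_put(struct ring *r, const unsigned char *buf, size_t len)
{
  size_t i;
  pthread_mutex_lock(&r->lock);
  for(i = 0; i < len; i++) {
    while(r->count == RING_SIZE)
      pthread_cond_wait(&r->not_full, &r->lock);
    r->data[r->head] = buf[i];
    r->head = (r->head + 1) % RING_SIZE;
    r->count++;
    pthread_cond_signal(&r->not_empty);
  }
  pthread_mutex_unlock(&r->lock);
}

/* Consumer side, called from the processing thread. Blocks while the ring is
   empty, which is where I suspect the starvation shows up. */
static size_t ring_get(struct ring *r, unsigned char *out, size_t max)
{
  size_t i, n;
  pthread_mutex_lock(&r->lock);
  while(r->count == 0)
    pthread_cond_wait(&r->not_empty, &r->lock);
  n = r->count < max ? r->count : max;
  for(i = 0; i < n; i++) {
    out[i] = r->data[r->tail];
    r->tail = (r->tail + 1) % RING_SIZE;
  }
  r->count -= n;
  pthread_cond_signal(&r->not_full);
  pthread_mutex_unlock(&r->lock);
  return n;
}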
So I am just exploring all options at the moment to speed up my learning curve and get to a resolution.

Thank you for your time and effort on this great library.

Regards,

 

Fadi .K

> Date: Tue, 3 Jan 2012 15:35:34 +0100
> From: daniel_at_haxx.se
> To: curl-library_at_cool.haxx.se
> Subject: Re: Downloading unlimited file. Some clarification please.
>
> On Tue, 3 Jan 2012, Fadi Kahhaleh wrote:
>
> > I am using LibCURL to download files from a local server in my network. One
> > gotcha about the file I am downloading is that it is endless!
>
> ...
>
> > Is there any option that I need to set?
>
> No. That's a perfectly normal HTTP use case. To libcurl it doesn't matter if
> it ever ends or not.
>
> > After some period (which is random at best) my callback function starts to
> > get small buffer sizes (i.e. 300-ish bytes) compared to a few hundred
> > Kbytes, I guess.
>
> You guess wrong and this somehow tells us something. You see it get "small"
> buffers but you have no idea how large they were before?
>
> libcurl uses maximum 16K buffers unless you modified the build of it.
>
> > I tried to download a fixed file size (but tried a large file approx. 350 Mb
> > in size) and it caused the same behavior.
>
> So then we can rule out that it is related to the never-ending stream, right?
>
> > * Failed writing body (0 != 16384)
> > * Closing connection #0
>
> This indicates your write callback doesn't return the proper value.
>
> --
>
> / daniel.haxx.se

-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-library
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2012-01-03