
curl-library

Re: Receiving buffer size less than CURL_MAX_WRITE_SIZE

From: ilavarasan M <m.ilavarasan_at_gmail.com>
Date: Sat, 5 Jan 2013 13:39:04 +0530

Hi,

I'm really stuck on this problem. Can you please help out? Is there
anything I need to check to solve this issue?

Regards,
Ilavaa

On Fri, Dec 28, 2012 at 12:59 PM, ilavarasan M <m.ilavarasan_at_gmail.com> wrote:

> Hi,
>
> Thanks for your reply. I have yet to investigate the network connection, and
> I can't get the server-side logs right now. From my analysis I believe there
> is no network problem, because a separate utility I ran downloaded the 60 GB
> file successfully. In my previous program, I do the following:
>
> 1. Download the contents from the remote URL.
> 2. In the callback function, the data is copied to another buffer whose
> size is 10 MB (a rough sketch of this callback is below).
> 3. Once the buffer is filled to 10 MB, I send it to another location.
> 4. Until the buffer is emptied, curl does not download any more content.
> 5. Once the buffer is emptied, curl starts downloading again.
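>
> To make steps 2 and 3 concrete, here is a rough sketch of what my callback
> does (names such as flush_to_destination() and struct staging are
> placeholders for illustration, not my actual code):
>
>   #include <curl/curl.h>
>   #include <string.h>
>
>   #define STAGING_SIZE (10 * 1024 * 1024)  /* 10 MB staging buffer */
>
>   struct staging {
>     char buf[STAGING_SIZE];
>     size_t used;
>   };
>
>   /* placeholder: forwards the staged data to the other location and only
>      returns when that is done; while it runs, the callback has not
>      returned, so libcurl is not reading from the socket */
>   extern void flush_to_destination(const char *data, size_t len);
>
>   static size_t write_cb(char *ptr, size_t size, size_t nmemb, void *userp)
>   {
>     struct staging *st = (struct staging *)userp;
>     size_t n = size * nmemb;  /* anything up to CURL_MAX_WRITE_SIZE bytes */
>     size_t off = 0;
>
>     while(off < n) {
>       size_t room = STAGING_SIZE - st->used;
>       size_t chunk = (n - off < room) ? (n - off) : room;
>       memcpy(st->buf + st->used, ptr + off, chunk);
>       st->used += chunk;
>       off += chunk;
>       if(st->used == STAGING_SIZE) {  /* buffer full: send it out */
>         flush_to_destination(st->buf, st->used);
>         st->used = 0;
>       }
>     }
>     return n;  /* tell libcurl the whole chunk was consumed */
>   }
>
> This is installed with CURLOPT_WRITEFUNCTION / CURLOPT_WRITEDATA, with the
> staging struct allocated once for the whole transfer.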
>
> I suspect that curl exits because sending the data to the other location
> takes a long time. Is there any timeout or limit on how long the curl
> callback function may take? I'm able to reproduce the issue by putting a
> sleep in the callback function; the essence of that repro is sketched
> below. If there is no delay in consuming the downloaded content,
> everything works fine. I have attached my code.
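>
> The reproduction is essentially just this (a sketch of the idea; slow_cb
> is a made-up name, and the real attachment does more than this):
>
>   #include <unistd.h>  /* for sleep() */
>
>   /* every call sleeps, simulating the slow "send to another location"
>      step; while we sleep here, libcurl does not read from the socket */
>   static size_t slow_cb(char *ptr, size_t size, size_t nmemb, void *userp)
>   {
>     (void)ptr;
>     (void)userp;
>     sleep(5);
>     return size * nmemb;
>   }
>
> With something like this installed as CURLOPT_WRITEFUNCTION, I see the
> same early exit as with the real buffering code.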
>
> Is there any option I need to set to cope with this delay?
>
> Thanks & Regards,
> Ilavaa.
>
> On Fri, Dec 21, 2012 at 6:02 PM, Daniel Stenberg <daniel_at_haxx.se> wrote:
>
>> On Fri, 21 Dec 2012, ilavarasan M wrote:
>>
>>> I'm using the latest version 7.28.1 to download the files. I'm trying to
>>> download a file which is 60 GB in size using cURL (C++). I'm able to get
>>> around 17 GB of content and then suddenly cURL comes out with a success
>>> return code and an HTTP response code of 200. I tried to debug and found
>>> that curl comes out when it receives less than 16K in the callback
>>> function. I'm getting random values like 4K or 13K. I can confirm the
>>> issue is because of this.
>>>
>>
>> No. That is just another symptom, not the cause, of the problem. libcurl
>> will provide you with as much data as it was able to read from the socket.
>>
>>
>>> Is there any way to fix this?
>>>
>>
>> We can't tell what your problem is based only on this. We simply don't
>> know what the problem is!
>>
>> If I were to investigate that problem I'd A) check the server side logs
>> for when the failure occurs and B) wireshark the network and investigate
>> the TCP action at the time of the transfer stop.
>>
>> So curl_easy_perform() returns OK for this transfer? Is that an old-style
>> HTTP close connection to indicate end of transfer?
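>>
>> One quick way to check that (a rough, untested sketch; it assumes 'curl'
>> is your already-configured easy handle) is to compare what arrived with
>> what the server announced, right after the transfer:
>>
>>   double expected = -1.0, received = 0.0;
>>   CURLcode res = curl_easy_perform(curl);
>>   curl_easy_getinfo(curl, CURLINFO_CONTENT_LENGTH_DOWNLOAD, &expected);
>>   curl_easy_getinfo(curl, CURLINFO_SIZE_DOWNLOAD, &received);
>>   if(res == CURLE_OK && expected > 0 && received < expected)
>>     fprintf(stderr, "server announced %.0f bytes, got %.0f\n",
>>             expected, received);
>>
>> If CURLINFO_CONTENT_LENGTH_DOWNLOAD comes back as -1, the server sent no
>> Content-Length at all, which would fit the old-style close-to-signal-end
>> case.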
>>
>>
>>> Can I set CURL_MAX_WRITE_SIZE?
>>>
>>
>> That's a compile-time value you can alter when you rebuild libcurl, but
>> changing it won't fix this problem...
>>
>> --
>>
>> / daniel.haxx.se
>>
>
>

-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-library
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2013-01-05