curl-library
Re: 7.16.0 regression with large files
Date: Thu, 11 Jan 2007 16:49:23 -0800
On 11 January 2007, at 04.40.22, Daniel Stenberg wrote:
> On Wed, 10 Jan 2007, Toby Peterson wrote:
>
>> There appears to be a regression in 7.16.0 when downloading large
>> files. Essentially, on a file larger than 4GB, it cuts off with
>> exactly 4GB remaining.
>
> How weird. I don't see what in libcurl even knows about that
> number, much less would care about it. Are you sure you get a
> sane download from this server?
With 7.15.5, it works fine. I compared SHA-1 hashes, etc.
> What if you transfer a file that is 4GB+2 bytes, do you only get
> the 2 bytes then before it gets cut off?
Just tested this, and that's exactly what happens.
>> Seems like an off-by-one error
>
> How so?
It seemed like it might be hitting a comparison issue exactly when
4GB remained, but I'd say the typecast Matt Witherspoon pointed out
is the more likely cause: if the remaining byte count is truncated to
32 bits, exactly 4GB truncates to 0 and 4GB+2 truncates to 2, which
matches what I'm seeing.
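For reference, here's a minimal standalone sketch (hypothetical code,
not the actual sendf.c logic) of how truncating a 64-bit remaining-byte
count to 32 bits would produce exactly these symptoms:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
  /* exactly 4GB left: the low 32 bits are all zero */
  int64_t remaining = (int64_t)1 << 32;
  printf("%u\n", (uint32_t)remaining);  /* prints 0: looks finished */

  /* 4GB+2 bytes left: only the 2 survives the cast */
  remaining = ((int64_t)1 << 32) + 2;
  printf("%u\n", (uint32_t)remaining);  /* prints 2: 2 bytes, then cut off */

  return 0;
}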
> http://cool.haxx.se/cvs.cgi/curl/lib/sendf.c.diff?r2=1.111&r1=1.110&diff_format=u
>>
>> What particularly confuses me is Curl_readwrite() in transfer.c
>> stops reading if (0 >= nread), meaning that Curl_read() "succeeds"
>> when nread == 0, but Curl_readwrite() "fails". Anyway, I hope this
>> is sufficient information to resolve the issue.
>
> While I agree with you it looks odd, I don't see how that could
> cause a problem that would cut off transfers when they have 4GB
> left to download... I mean, why would recv() suddenly return 0?
>
> The very last line in that diff seems unmotivated though. I can't
> see any explanation for why that was changed from += to =...
Yeah, it just seemed off when I was looking at it, but again, Matt
Witherspoon is probably correct. I'm going to investigate that
typecast a bit more.
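In the meantime, to make the (0 >= nread) point concrete, here's a tiny
standalone sketch (hypothetical stand-ins, not the real Curl_read() or
Curl_readwrite() code): a reader that returns 0 for "no data right now"
gets treated as end-of-transfer, and the byte count only stays correct
because of the +=:

#include <stdio.h>

/* Stand-in for Curl_read(): returns >0 bytes read, 0 for "no data
   right now", -1 on error. Returns 0 on the third call to simulate
   a momentarily empty socket. */
static int my_read(char *buf, int len)
{
  static int calls = 0;
  (void)buf;
  (void)len;
  return (++calls == 3) ? 0 : 100;
}

int main(void)
{
  char buf[16384];
  long bytecount = 0;

  for(;;) {
    int nread = my_read(buf, (int)sizeof(buf));
    if(0 >= nread)      /* 0 and -1 both stop the loop, so a benign */
      break;            /* 0 ends the "transfer" prematurely        */
    bytecount += nread; /* with "=" instead of "+=", the total would */
                        /* only reflect the last read                */
  }
  printf("read %ld bytes before stopping\n", bytecount);
  return 0;
}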
- Toby
Received on 2007-01-12