
curl-users

Re: Bug report: curl win32 ftruncate64 not working correctly.

From: gdb007 <gdb007.mail_at_gmail.com>
Date: Fri, 14 Dec 2007 14:16:45 +0800

On Dec 14, 2007 5:08 AM, Daniel Stenberg <daniel_at_haxx.se> wrote:
> On Thu, 13 Dec 2007, gdb007 wrote:
>
> > It seems curl built for win32 uses an internal ftruncate64 function
> > to truncate the file when retrying a download. But that ftruncate64
> > only does a file seek: it writes 0 bytes at the resizing position and
> > then seeks back to the original position, which doesn't actually
> > resize the file, so the file still contains the data downloaded
> > before the retry.
>
> So it's a truncate function that doesn't truncate? Now that's useful... ;-)
>
> > _chsize_s() should work as a replacement for ftruncate64 on win32. It uses
> > a 64-bit integer for the file size and can therefore handle file sizes
> > greater than 4 GB. See
> > http://msdn2.microsoft.com/en-us/library/whx354w1(VS.80).aspx for the
> > function details.
>
> That page says it is "specific to Microsoft Visual Studio 2005/.NET Framework
> 2.0". If I click on the link for "Microsoft Visual Studio 2003/.NET Framework
> 1.1" (long names indeed) it says _chsize() only and that one just takes a
> long.
>
> Does this mean that _chsize_s() somehow only works with VS2005 or later?
>
Well, you are right: _chsize_s() requires VS2005 or .NET 2.0. I had
assumed it would work on old versions of Windows, since it is a CRT
function and even Windows 95 is listed in the Compatibility
Requirements on that page. So it might not be a good idea to use this
non-standard function, as it would break compatibility with older
Windows systems.
_chsize() works on all versions of Windows but doesn't support large
files. There is another API, SetEndOfFile(), that could do the job,
but it won't suit curl, since it requires a win32 HANDLE as input and
would need lots of code changes.
Anyway, the current ftruncate64 function on win32 actually does
nothing, and the fseek(outs.stream, 0, SEEK_END) that follows will
always corrupt the file if any data was downloaded before the retry,
for small and large files alike. Using the 32-bit _chsize(), or
undefining HAVE_FTRUNCATE for win32 and using _lseeki64() to seek back
to the initial position, is still better, since those only cause
problems for files larger than 4 GB, which is the rarer case.
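To make the no-op concrete, here is a small portable demonstration of
the sequence the internal win32 ftruncate64 performs (seek to the
wanted size, write zero bytes, seek back); the helper name and path are
made up for the example:

```c
#include <stdio.h>
#include <sys/stat.h>

/* perform the seek-and-write-0-bytes sequence at offset 'len', then
   return the file's size afterwards -- it will be unchanged, because
   writing zero bytes neither extends nor shrinks a file */
static long fake_truncate_size(const char *path, long len)
{
    FILE *f = fopen(path, "rb+");
    if(!f)
        return -1;
    fseek(f, len, SEEK_SET);   /* seek to the intended new size ...  */
    fwrite("", 1, 0, f);       /* ... "write" 0 bytes (a no-op) ...  */
    fclose(f);                 /* ... and the length stays the same  */

    struct stat st;
    if(stat(path, &st) != 0)
        return -1;
    return (long)st.st_size;
}
```

So a 10-byte file "truncated" to 4 bytes this way still has 10 bytes,
which is exactly why the stale downloaded data survives the retry.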

btw: if you look at the _lseeki64() documentation at
http://msdn2.microsoft.com/en-us/library/1yee101t(VS.80).aspx you will
see it has exactly the same requirements as _chsize_s()...

> > Another possible bug is: curl with the --retry option set won't retry if
> > the connection is lost during a download. It will show the message
> > "transfer closed with xxxx bytes remaining to read" and then exit with
> > CURLE_PARTIAL_FILE or CURLE_GOT_NOTHING (a very rare case, when the
> > connection is closed before any data is received). I am not sure if this
> > is working as intended or if I am missing some options.
>
> It is intended. I made --retry only retry on transient errors, and those
> errors are not really transient, nor likely to go away on a retry.
> If curl would be made to retry on those, where would we draw the line?
>
> > I suggest a feature to allow retrying for files that were not fully
> > downloaded (in cases where you know the file size and the server reports
> > remaining bytes) if the connection is lost.
>
> CURLE_PARTIAL_FILE basically? Yeah, I could agree to that if it is made as
> a separate option.
>
> > Also, an option to not throw away partly downloaded data and truncate the
> > file would be helpful when downloading a large file over a slow or
> > unstable network. (e.g. you downloaded 100 MB of a 200 MB file and then
> > timed out; it is good to keep those 100 MB before retrying.)
>
> I guess that makes sense. I don't really remember exactly why I made the
> functionality as it is today.
>
> Improving --retry is not a priority to me (quite honestly, I've _never_ used
> the option in real life), please consider jumping in to help out here if you
> want this happening anytime soon.
>
> --
> Commercial curl and libcurl Technical Support: http://haxx.se/curl.html
>
Yeah, I understand. So I think I might try to adjust the code a bit to
suit my needs, or just write some shell scripts to handle it. Open
source software is good in cases like this, isn't it? :)
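For the shell-script route, something like the following retry loop
might do (a rough sketch, not curl's behaviour: all paths and names are
placeholders, and a local file:// URL stands in for the real download
so the example is self-contained; -C - makes curl resume from whatever
is already in the output file instead of starting over):

```shell
# stand-in for the remote file, created here only so the sketch runs
printf 'pretend this is a big download' > /tmp/retry-demo-src
URL="file:///tmp/retry-demo-src"
OUT="/tmp/retry-demo-out"
rm -f "$OUT"

n=0
# keep retrying until curl succeeds; -C - resumes from the current
# size of $OUT, so partial data is kept rather than thrown away
until curl -s -C - -o "$OUT" "$URL"; do
  n=$((n + 1))
  [ "$n" -ge 5 ] && { echo "giving up"; exit 1; }
  sleep 1
done
echo "download complete"
```

This sidesteps the truncation problem entirely, since the partial file
is never discarded between attempts.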

And thanks for making this sweet tool, powerful and flexible.
Received on 2007-12-14