curl-library
>2 GB file support
Date: Tue, 12 Aug 2003 18:42:16 -0700
I had a brief conversation today with Daniel Stenberg in SourceForge's
bug tracking system about >2GB file support in libcurl, and he pointed
me at this list.
So, I have libcurl downloading >2GB files with the patch I'm
attaching to this mail. This is a preliminary patch that has
several caveats:
1) I have only tested http/https downloads. I haven't tested ftp,
although the changes should work for ftp as well.
2) This adds a new type "filesize_t", which I've simply #defined
to int64_t. This should be autoconf'd to something more reasonable
(see the first sketch after this list).
3) This change is binary incompatible with the existing libcurl:
it changes the size of several libcurl structures and the parameters
to several functions.
4) One problem with autoconf'ing the type of filesize_t is that there
are many printf() format strings and one sscanf() format string that
need to be updated depending on the size of filesize_t. This could be
done with macros or internal functions that generate the correct
printf()/sscanf() conversions for whatever filesize_t turns out to be
(see the second sketch after this list). I'm open to suggestions on
how best to implement that portably.
5) I haven't done extensive testing, but "works for me" =)
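To make caveat 2 concrete: right now the patch just does the equivalent
of "#define filesize_t int64_t". A configure-time selection might look
roughly like this instead (HAVE_LONGLONG is only a placeholder for
whatever symbol the real configure check would define):

  /* Sketch only: let configure pick the widest integer type it found. */
  #ifdef HAVE_LONGLONG
  typedef long long filesize_t;   /* 64-bit offsets where the compiler has them */
  #else
  typedef long filesize_t;        /* otherwise fall back to plain long */
  #endif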
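For caveat 4, one possible shape for the macro route, again only a
sketch (FILESIZE_FMT is an illustrative name, and it reuses the
placeholder HAVE_LONGLONG from above):

  #include <stdio.h>

  #ifdef HAVE_LONGLONG
  typedef long long filesize_t;
  #define FILESIZE_FMT "%lld"     /* conversion matching long long */
  #else
  typedef long filesize_t;
  #define FILESIZE_FMT "%ld"      /* conversion matching long */
  #endif

  static void show_progress(filesize_t bytes)
  {
    /* Adjacent string literals are concatenated at compile time,
     * so the format string always matches the width of filesize_t. */
    printf("Received " FILESIZE_FMT " bytes\n", bytes);
  }

  static filesize_t parse_content_length(const char *value)
  {
    filesize_t size = 0;
    sscanf(value, FILESIZE_FMT, &size);   /* same macro works for sscanf() */
    return size;
  }

An internal helper that formats a filesize_t into a caller-supplied
buffer would keep the conversions in one place instead, but the macro
route is probably easier to drop into the existing printf() calls.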
Anyway, I'd love to get comments/flames/suggestions on this patch.
Thanks,
Rob
- text/plain attachment: curlbf.diff