curl-library
Large transfers
Date: Thu, 31 Jul 2003 15:02:01 +0200
Attempting to download a large file (>2GB) currently runs into several
issues with curl.
For example, on very large files Apache often sends a bogus
Content-Length, which curl 7.10.6 fails to interpret correctly:
<= Recv header, 17 bytes (0x11)
0000: HTTP/1.1 200 OK
<= Recv header, 37 bytes (0x25)
0000: Date: Thu, 31 Jul 2003 12:24:08 GMT
<= Recv header, 70 bytes (0x46)
0000: Server: Apache/1.3.27 (Unix) DAV/1.0.3 mod_ssl/2.8.12 OpenSSL/0.
0040: 9.6g
<= Recv header, 46 bytes (0x2e)
0000: Last-Modified: Thu, 17 Jul 2003 18:10:55 GMT
<= Recv header, 34 bytes (0x22)
0000: ETag: "14c2ec-bf8532e8-3f16e6af"
<= Recv header, 22 bytes (0x16)
0000: Accept-Ranges: bytes
<= Recv header, 29 bytes (0x1d)
0000: Content-Length: -1081789720
<= Recv header, 26 bytes (0x1a)
0000: Content-Type: text/plain
<= Recv header, 26 bytes (0x1a)
0000: Content-Encoding: x-gzip
<= Recv data, 1139 bytes (0x473)
curl: (18) transfer closed with -1081789720 bytes remaining to read
This appears to have been fixed in Apache only very recently:
http://nagoya.apache.org/bugzilla/show_bug.cgi?id=21323
I suppose curl should ignore a Content-Length < 0, attempt resuming
only at positions smaller than 2GB, and otherwise just stream to the
end of file and cross its fingers (wget apparently does something
like this -- uh, sorry for using the w-word).
I tried looking into developing a patch but stumbled on this in
lib/transfer.c:581:
sscanf (k->p+15, " %ld", &k->contentlength))
Obviously curl also needs upgrading to 64-bit file lengths/offsets,
and if you use sscanf you need different % qualifiers on different
platforms (printf is usually %lld on unixish systems and %I64d on
VC6/7; scanf is '%qd' on unixish systems and %I64d on VC6/7 -- that's
capital 'i' followed by 64).
Unfortunately, given the number of different things that need to be
hacked, I don't currently have the time to work on it. It's not a
major issue, but I thought it was worth noting; hopefully someone can
develop a patch quickly starting from my notes.
Duncan
Received on 2003-07-31