
curl-library

Handling broken ftp REST over 2 GB

From: Dave Meyer <meyer_at_paracel.com>
Date: Tue, 9 Dec 2003 17:11:47 -0800 (PST)

Howdy,

I'm attempting to use curl to download a large (~2.5 GB) file, and as
often happens with long transfers like this, it frequently fails to
download the entire file. Thus, I'm using the resume option (-C -
on the command line) to continue downloading the file from where it
last left off.
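
(For the libcurl-minded, since this is the library list: a resumed
download boils down to setting CURLOPT_RESUME_FROM before the transfer,
something like the sketch below. Note that CURLOPT_RESUME_FROM takes a
plain 'long', which is part of why offsets past 2 GB are awkward in the
first place; the local file name and URL are just the ones from my
transfer, and this is only a rough sketch, not a drop-in program.)

#include <stdio.h>
#include <sys/stat.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  CURLcode res = CURLE_FAILED_INIT;
  struct stat sb;
  long resume_from = 0;
  FILE *out;

  /* resume from however much of the file we already have locally */
  if(stat("nt.gz", &sb) == 0)
    resume_from = (long)sb.st_size; /* truncates past 2 GB where long is 32 bits */

  out = fopen("nt.gz", "ab");       /* append to the partial file */

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();

  if(curl && out) {
    curl_easy_setopt(curl, CURLOPT_URL,
                     "ftp://ftp.ncbi.nih.gov/blast/db/FASTA/nt.gz");
    curl_easy_setopt(curl, CURLOPT_RESUME_FROM, resume_from);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, out); /* default callback writes here */
    res = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  if(out)
    fclose(out);
  curl_global_cleanup();
  return (res == CURLE_OK) ? 0 : 1;
}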

The ftp server that I'm communicating with supports the REST command,
although its support is slightly broken (in my opinion): when curl asks
for REST with a file offset larger than 2 GB, the server responds saying
that's OK, but that it's going to resume the download starting at
2^31 - 1 bytes (2147483647) into the file, regardless of which offset
larger than 2 GB I actually requested. I should note here that I'm using
a patched version of curl which I've modified to support files larger
than 2 GB, but I've seen the same behavior from other ftp clients
connecting to this server.

This is a brief excerpt from the transcript of talking to the ftp server
in question (which is ftp.ncbi.nih.gov, by the way, and I'm getting
/blast/db/FASTA/nt.gz from there):

* FTP RANGE 2825135000 to end of file
* range-download from 2825135000 to -1, totally -1 bytes
> TYPE I
< 200 Type set to I
> SIZE nt.gz
< 213 2825135408
* Instructs server to resume from offset 2825135000
> REST 2825135000
< 350 Restarting at 2147483647. Send STORE or RETRIEVE to initiate transfer
> RETR nt.gz
< 150 Opening BINARY mode data connection for nt.gz (677651761 bytes)
* Getting file with size: 408
** Resuming transfer from byte position 2825135000

Note what the transcript shows: curl computes the remaining size from
SIZE minus the requested offset (2825135408 - 2825135000 = 408 bytes),
while the server's 150 reply makes it clear it will actually send
677651761 bytes, i.e. everything after offset 2147483647.

From my browsing of the code, it doesn't appear that curl pays attention
to the actual text of the ftp server's reply to the REST command -- it
only checks the response code. If I'm missing something, I would
appreciate a prod in the right direction for handling this. If not, I
was wondering whether it would make sense to work on something to
support broken ftp servers like this: read the offset the server says it
will actually restart at and adjust accordingly (a rough sketch of that
follows below). Certainly if I restart at the 2 GB boundary I can still
get the rest of the file (assuming the transfer isn't interrupted), so
handling the condition seems more useful than just detecting it or
declaring that REST isn't supported beyond 2 GB...
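
(In case it helps to make the idea concrete, here's roughly the check I
have in mind, as a standalone sketch rather than anything taken from
curl's sources: pull the offset out of the server's 350 reply text and
compare it with the one that was requested. The "Restarting at" wording
is simply what this particular server sends, so a real implementation
would have to be more forgiving about reply formats.)

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper, not part of curl: given the raw 350 reply line and
   the offset we asked for, return the offset the server says it will
   really use, e.g. for
   "350 Restarting at 2147483647. Send STORE or RETRIEVE to initiate transfer" */
static long long rest_offset_from_reply(const char *reply, long long requested)
{
  const char *p = strstr(reply, "Restarting at ");
  if(p) {
    long long actual = strtoll(p + strlen("Restarting at "), NULL, 10);
    if(actual > 0 && actual != requested)
      return actual;    /* server clamped the offset (here, to 2^31 - 1) */
  }
  return requested;
}

int main(void)
{
  const char *reply =
    "350 Restarting at 2147483647. Send STORE or RETRIEVE to initiate transfer";
  long long requested = 2825135000LL;
  long long actual = rest_offset_from_reply(reply, requested);

  if(actual != requested)
    printf("server resumes at %lld, not %lld; the first %lld bytes it sends\n"
           "are data we already have and could be skipped\n",
           actual, requested, requested - actual);
  return 0;
}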

Thoughts?

Thanks,

Dave

Received on 2003-12-10