curl-library
dealing with broken FTP servers
Date: Sun, 22 Jan 2012 11:37:57 -0500
Hi all,
I have a curl powered application that has to deal with a large number
of FTP and HTTP servers. The FTP side of this has proven extremely
problematic, through no fault of curl. The app supports resuming
downloads by using temporary files.
Lately, I've been finding that some FTP servers will time out on the
connection close [1], resulting in curl (rightfully) returning an error
from the operation, even though the file is at its full size. Currently,
the application logic says that a timeout is a failure and will not
rename the temporary file to the proper destination filename that the
app expects to read from. This becomes problematic later on in the app
for reasons that aren't relevant here. I'll point out that the app
_does_ sometimes know the expected size of the file ahead of time, but
this isn't always the case.
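To sketch the current flow (illustrative code only; the names and the
exact option choices are mine, not necessarily what the app does):

#include <curl/curl.h>
#include <stdio.h>
#include <sys/stat.h>

/* Download to a temp file, resuming from whatever is already there,
 * and rename it into place only if curl reports success. */
static int fetch(CURL *curl, const char *url,
                 const char *tmp, const char *dest)
{
    struct stat st;
    curl_off_t from = (stat(tmp, &st) == 0) ? (curl_off_t)st.st_size : 0;
    FILE *out = fopen(tmp, from ? "ab" : "wb");

    if (!out)
        return -1;

    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_RESUME_FROM_LARGE, from);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);

    CURLcode res = curl_easy_perform(curl);
    fclose(out);

    /* a timeout counts as failure, so the rename never happens, even
     * when the file actually arrived in full */
    if (res != CURLE_OK)
        return -1;

    return rename(tmp, dest);
}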
My question is: what's the best way to deal with this? A quick and dirty
method might be to compare the content-length reported for the transfer
with the actual amount of data that was downloaded (both obtainable via
curl_easy_getinfo) when curl returns an operation timeout. Does this
seem reasonable? Is there a simpler or better method that I'm
overlooking?
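Roughly what I have in mind (untested sketch; download_ok is just an
illustrative name):

#include <curl/curl.h>

/* Decide whether a finished transfer should count as a success,
 * treating a timeout as benign if the whole file arrived anyway. */
static int download_ok(CURL *curl, CURLcode res)
{
    double expected = 0.0;
    double received = 0.0;

    if (res == CURLE_OK)
        return 1;

    if (res == CURLE_OPERATION_TIMEDOUT) {
        /* expected is -1 when the server never reported a size */
        curl_easy_getinfo(curl, CURLINFO_CONTENT_LENGTH_DOWNLOAD, &expected);
        curl_easy_getinfo(curl, CURLINFO_SIZE_DOWNLOAD, &received);

        if (expected > 0 && received >= expected)
            return 1; /* full file arrived; treat the timeout as benign */
    }

    return 0; /* genuine failure: keep the temp file, skip the rename */
}

Of course, when the server never reported a size, expected stays at -1
and the timeout would still have to count as a failure.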
I'd rather just ban FTP everywhere, but something tells me I need to
figure out a graceful way of dealing with this kind of crap.
Cheers,
Dave
[1] https://bbs.archlinux.org/viewtopic.php?pid=1045345#p1045345