curl-library
libcurl error 56 caused by SSL error 10054
Date: Thu, 5 Jan 2012 07:19:45 -0500
I have implemented an FTP get in my application using libcurl. It uses
wildcard matching, so a single libcurl call can end up downloading hundreds
of files. For one particular fileset I am testing against, it downloads
close to 300 files of about 2 KB each and then I get libcurl error 56
(CURLE_RECV_ERROR). The libcurl log file shows the following:
* Connection #0 to host myhost.com left intact
* Re-using existing connection! (#0) with host myhost.com
* Connected to myhost.com (x.x.x.x) port 21 (#0)
* Wildcard - START of "ShipAck1246-25765.xml"
* Request has same path as previous transfer
> EPSV
* Connect data stream passively
< 229 Entering Extended Passive Mode (|||33071|)
* Trying x.x.x.x... * connected
* Connecting to x.x.x.x (x.x.x.x) port 33071
> RETR ShipAck1246-25765.xml
< 125 Data connection already open; Transfer starting.
* Doing the SSL/TLS handshake on the data stream
* SSL re-using session ID
* SSL connection using AES128-SHA
* Server certificate:
* subject: C=US; ST=Connecticut; L=Rocky Hill; O=Software Marketing Associates; CN=*.myhost.com
* start date: 2010-10-14 00:00:00 GMT
* expire date: 2012-12-12 23:59:59 GMT
* issuer: C=US; O=Thawte, Inc.; CN=Thawte SSL CA
* SSL certificate verify result: unable to get local issuer certificate (20), continuing anyway.
* Maxdownload = -1
* Getting file with size: 1872
* Remembering we are in dir "myfolder/"
* SSL read: error:00000000:lib(0):func(0):reason(0), errno 10054
* Connection #0 to myhost.com left intact
* Failure when receiving data from the peer
* SSL_write() returned SYSCALL, errno = 10054
* Closing connection #0
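For context, the transfer is set up roughly along the lines of the sketch
below (simplified, with a placeholder URL and callbacks rather than my exact
code; the real application also sets credentials and certificate options):

#include <stdio.h>
#include <curl/curl.h>

static FILE *outfile;  /* one output file per matched remote file */

static size_t write_cb(char *ptr, size_t size, size_t nmemb, void *userdata)
{
  (void)userdata;
  return outfile ? fwrite(ptr, size, nmemb, outfile) : size * nmemb;
}

/* called before each matched file is downloaded */
static long chunk_bgn(struct curl_fileinfo *finfo, void *ptr, int remains)
{
  (void)ptr; (void)remains;
  if(finfo->filetype == CURLFILETYPE_FILE) {
    outfile = fopen(finfo->filename, "wb");
    if(!outfile)
      return CURL_CHUNK_BGN_FUNC_FAIL;
  }
  return CURL_CHUNK_BGN_FUNC_OK;
}

/* called after each matched file has been downloaded */
static long chunk_end(void *ptr)
{
  (void)ptr;
  if(outfile) {
    fclose(outfile);
    outfile = NULL;
  }
  return CURL_CHUNK_END_FUNC_OK;
}

int main(void)
{
  CURL *curl;
  CURLcode res;

  curl_global_init(CURL_GLOBAL_ALL);
  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "ftp://myhost.com/myfolder/ShipAck*.xml");
    curl_easy_setopt(curl, CURLOPT_WILDCARDMATCH, 1L);
    curl_easy_setopt(curl, CURLOPT_CHUNK_BGN_FUNCTION, chunk_bgn);
    curl_easy_setopt(curl, CURLOPT_CHUNK_END_FUNCTION, chunk_end);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_USE_SSL, (long)CURLUSESSL_ALL);  /* FTP over TLS */
    curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L);

    res = curl_easy_perform(curl);  /* fails with 56 after roughly 300 files */
    if(res != CURLE_OK)
      fprintf(stderr, "curl_easy_perform() failed: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return 0;
}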
I found one other mention of this error on the mailing list, but that was
for a PHP-based application, and the apparent cause there was PHP cutting
the script off because it ran too long. In my case I am calling libcurl
from a Windows application, so nothing is cutting it off externally. That
post mentioned adjusting the following options (a sketch of how these would
be set explicitly is below the list):
    - CURLOPT_CONNECTTIMEOUT, which according to the documentation only
limits the time the connect phase may take. In my case the connection is
established; it is the file data that is never received. Also, the failure
happens after roughly 80 seconds of total connection time, while the
default for this option (which I am using unchanged) is 300 seconds.
    - CURLOPT_DNS_CACHE_TIMEOUT, which as I understand it only controls
how long resolved IP addresses for the host names are kept in the cache.
    - CURLOPT_TIMEOUT, which as I understand it defaults to no limit, so
as long as I don't set it, libcurl will never time out the whole transfer.
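For reference, here is how those three options would look if set explicitly;
the values shown are just libcurl's documented defaults, not something I am
currently overriding:

#include <curl/curl.h>

/* illustration only: the timeout-related options discussed above,
   set to their documented default values */
static void set_timeout_defaults(CURL *curl)
{
  curl_easy_setopt(curl, CURLOPT_CONNECTTIMEOUT, 300L);   /* connect phase only; default 300 s */
  curl_easy_setopt(curl, CURLOPT_DNS_CACHE_TIMEOUT, 60L); /* DNS cache entry lifetime; default 60 s */
  curl_easy_setopt(curl, CURLOPT_TIMEOUT, 0L);            /* whole transfer; 0 = never time out (default) */
}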
Any help you can offer is appreciated.
Thanks,
David
-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-library
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2012-01-05