
curl-users

Re: How to auto-retry a severed connection?

From: Doug McNutt <douglist_at_macnauchtan.com>
Date: Sun, 20 Jul 2008 13:38:03 -0600

At 14:12 -0400 7/20/08, jayjwa wrote:
>On Sat, 19 Jul 2008, PsiStormYamato_at_cs.com wrote:
>
>-> I see that there is a retry option (--retry xx), but I can't get it to
>-> work when an established connection has been severed, such as when my
>-> dialup connection is broken, and it returns error code 18: "Partial file.
>-> Only a part of the file was transferred." (Maybe the retry option only
>-> applies to the initial connection attempt?) I want curl to keep retrying
>-> if the connection is severed, so that it will seamlessly resume
>-> downloading after the internet connection is re-established.
>->
>-> I guess I could use a script that keeps looping the command until
>-> %errorlevel% is 0, but I thought that surely there is a native function
>-> in curl that already does what I need.
>->
>-> The command I am trying to use is:
>-> curl -C - -L -v -O --retry 999 -S "http://www.archive.org/download/usgs_drg_nc_35077_b1/o35077b1.tif"
>
>
>I'd also like to add in here something along these lines. Frequently, I'll be downloading something large with curl. On my low-bandwidth connection, downloading something around 56mb takes a long time - hours. I'll do something like this:
>
>curl -O -v http://url/file
>
>Thinking it's all set, I'll leave, or go to another task. Hours later I return and find the download cut out, at maybe 1-2mb, and curl will say something to the effect of
>
>*Closing connection #0 55426193 bytes left to read.
>
>I'm writing that from memory, but hopefully you know the message I mean. The network connection is still there, only the download cut out for some reason.
>
>Then I'll restart it, curl -C- -O -v http://url/file, and it will do the same thing, but maybe get to
>
>*Closing connection #0 52421131 bytes left to read.
>
>The --retry option has no effect on this. I can't see a way to script this either, as I believe curl exits successfully (EXIT_SUCCESS) even though the download is not successful. Other similar tools seem to handle this condition automatically, sending the request again until the full file is downloaded.
>
>It might be a server-side issue, as it seems to happen more with some servers. I can't get it to do it with my Apache install, but my Apache is newer, and it may be that older versions of some servers are more likely to do this. Still, it would be great if curl automatically retried these connections, because I can't think of a situation where one would want a partially downloaded file.

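The loop-until-success idea mentioned above (keep re-running the command until the exit code is zero) can be sketched as a small shell wrapper. The function name and the retry cap are mine, not from the thread:

```shell
# Hypothetical wrapper: re-run a command until it exits 0, up to a cap.
retry_curl() {
    max=$1; shift           # $1 = max attempts; the rest is the command
    n=1
    until "$@"; do
        [ "$n" -ge "$max" ] && return 1
        n=$((n + 1))
        sleep 1             # brief pause before resuming
    done
    return 0
}

# With the command from the original post, resuming with -C -:
# retry_curl 999 curl -C - -L -O -S \
#     "http://www.archive.org/download/usgs_drg_nc_35077_b1/o35077b1.tif"
```

Since -C - asks curl to resume from the existing partial file, each pass through the loop picks up where the previous one was cut off.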
I have had some luck using the -w option and returning the size actually downloaded. I use a Perl script that calls the curl tool with backticks. I can then test in Perl and retry with a -C if I don't get what I expect.

Yes, this assumes I know the approximate file size before I start, and that might be a problem. Perl could also look for that output, "*Closing connection #0 52421131 bytes left to read.", but I haven't tried that. I don't know whether it's written to stdout or stderr, but both can be redirected in the call.

It might also suffice to simply check the status as curl terminates.
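As a rough shell rendering of that size-check approach (the post above describes it in Perl with backticks), one could loop until the local file reaches an expected byte count. The function name, the attempt cap, and the expected size are all my own assumptions:

```shell
# Hypothetical helper: keep re-running a fetch command until the output file
# reaches an expected size in bytes.
download_until_size() {
    file=$1; expected=$2; max=$3; shift 3   # remaining args = fetch command
    n=0
    until [ "$(cat "$file" 2>/dev/null | wc -c)" -ge "$expected" ]; do
        [ "$n" -ge "$max" ] && return 1
        "$@" || true        # ignore the exit code; the size check decides
        n=$((n + 1))
    done
    return 0
}

# With curl (57000000 is a made-up expected size; -w '%{size_download}'
# could additionally report how much each individual run transferred):
# download_until_size o35077b1.tif 57000000 999 \
#     curl -C - -o o35077b1.tif -sS \
#     "http://www.archive.org/download/usgs_drg_nc_35077_b1/o35077b1.tif"
```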

-- 
--> From the U S of A, the only socialist country that refuses to admit it. <--
-------------------------------------------------------------------
List admin: http://cool.haxx.se/cgi-bin/mailman/listinfo/curl-users
FAQ:        http://curl.haxx.se/docs/faq.html
Etiquette:  http://curl.haxx.se/mail/etiquette.html
Received on 2008-07-20