curl-users

Re: Is --ftp-retry a good idea?

From: Ralph Mitchell <ralphmitchell_at_gmail.com>
Date: Tue, 26 Oct 2004 07:07:14 -0500

Really, the script writers ought to be checking return codes and
writing their own loops, but that's in a perfect world... :)

If you're going to add --ftp-retry, would there be a good case for
adding --http-retry as well? I could sure use it - I've had to make
some scripts loop up to five times attempting to fetch a page.

Would it be any harder to add "--retry N" in order to retry the
operation up to N times before giving up? That would be less
FTP-specific, but I don't know what other protocols would benefit
from an automatic retry.
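
In practice, such a loop can be quite small. A minimal sketch in shell
(the URL, output file, and retry count are placeholders; a real script
would probably inspect specific curl exit codes rather than retrying
on every failure):

    max=5
    rc=1
    for attempt in 1 2 3 4 5; do
        curl -f -s -S -o page.html http://example.com/
        rc=$?
        [ "$rc" -eq 0 ] && break    # success, stop looping
        echo "attempt $attempt of $max failed (curl exit $rc)" >&2
        sleep 2                     # brief pause before the next try
    done
    exit "$rc"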

Ralph Mitchell

On Tue, 26 Oct 2004 13:40:52 +0200 (CEST), Daniel Stenberg
<daniel-curl_at_haxx.se> wrote:
> Hi
>
> Sometimes when I'm bored, I look around the internet for various curl-related
> stuff that happens where I normally don't look.
>
> One area where I've found curl gets a lot of unjustified badmouthing is when
> it is used by 'urpmi' (the Mandrake installer program) and similar programs.
>
> The reason for this is that such a program uses curl or wget for downloading
> packages off the net. It doesn't treat the two any differently; it just
> invokes the chosen download program to fetch a remote file.
>
> While this may look like a good idea to many people, it doesn't produce the
> intended behaviour. The most typical case is when people attempt to download a
> file off a very busy FTP server: wget has internal retry functionality that
> makes it retry FTP transfers if the server says it is temporarily busy, while
> curl has none and exits immediately on all kinds of errors (assuming that
> whoever invokes curl takes care of retrying, or chooses not to).
>
> This makes curl look bad compared to wget in certain cases. I don't expect
> all users of these programs to fully grasp why this is so (and many probably
> don't care), and apparently the authors of these programs don't care enough
> to do much about it.
>
> I'm considering adding a new --ftp-retry option that would do exactly this: if
> the FTP server returns a transient error when we attempt to log in, we sleep a
> while and retry later. We double the sleep time for each attempt and make
> only a limited number of attempts.
>
> Is this a good idea? I honestly can't think of many drawbacks, other than
> that it pollutes the code somewhat with rather FTP-specific logic and an
> extra loop.
>
> If it is a good idea, would we also need a --ftp-retry-count (to set maximum
> number of retries) and --ftp-retry-time (to set maximum time allowed for
> retries)?
>
> What do you think?
>
> --
> Daniel Stenberg -- http://curl.haxx.se -- http://daniel.haxx.se
> Dedicated custom curl help for hire: http://haxx.se/curl.html
>
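
For illustration, the behaviour Daniel proposes - sleep on a transient
error, double the delay each time, give up after a bounded number of
attempts - might look roughly like this as a shell sketch. The URL,
initial delay, and attempt cap are placeholders, and a real
implementation inside curl would retry only on transient FTP replies
(4xx response codes), not on every failure as this loop does:

    delay=3
    attempts=0
    max_attempts=5
    while ! curl -s -S -O ftp://ftp.example.com/pub/file.tar.gz; do
        attempts=$((attempts + 1))
        if [ "$attempts" -ge "$max_attempts" ]; then
            echo "giving up after $attempts attempts" >&2
            exit 1
        fi
        sleep "$delay"
        delay=$((delay * 2))        # double the sleep time each attempt
    done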