curl-users
Re: --max-filesize and streams
Date: Sat, 10 Sep 2005 20:21:18 +0200 (CEST)
On Sat, 10 Sep 2005, Dave wrote:
>> But --retry is a fairly new addition to curl, it most probably needs some
>> tweaking still to become really good.
>
> It seems to me that retry should try to re-open a closed connection and
> continue the transfer.
Yes, I think so too. But if the remote size isn't known, curl can't tell
whether the transfer was aborted prematurely or not. So when a stream transfer
breaks, it can't know that it should continue it. (Unless the suggested
--stream option gets added, that is.)
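For comparison, when the remote size is known and the server supports byte
ranges, continuing a broken transfer by hand already works today, roughly like
this (URL and file name made up):

  # first attempt breaks halfway through
  curl -o bigfile.bin http://example.com/bigfile.bin
  # run it again with -C - and curl resumes from the size of the local file
  curl -C - -o bigfile.bin http://example.com/bigfile.bin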
> That's why I am thinking that there should be an option to specify the number
> of bytes to download. It could keep retrying/restarting until it transfers
> the specified number of bytes or exceeds the specified number of retries.
Yes, I can see why such an option would be useful.
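Until such an option exists, a crude workaround for streams is to let curl
write to stdout and cut the pipe after the wanted amount, something like this
(URL and size made up):

  # grab roughly the first 10 MB of the stream, then stop
  curl -s http://example.com/live-stream | head -c 10485760 > capture.bin

curl will exit with a write error once head closes the pipe, but the captured
bytes are what you asked for.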
>> I still believe the range option is what you want and unless it works
>> already, we could investigate why it doesn't.
>
> When I use the --range option it stops transferring and does not resume.
--range wouldn't resume, it would just ask the server for the given byte range
of the remote data.
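Against a server that does support byte ranges it works like this (URL made
up):

  # ask for the first 500 bytes only; a range-capable server answers
  # with 206 Partial Content
  curl --range 0-499 http://example.com/file.bin -o first500.bin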
> But it does not exit either. It continues counting off the time but no data
> is being transferred.
Odd. Can you show us a command line against a public URL that shows what
happens?
> ** Resuming transfer from byte position 192512
>   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
>                                  Dload  Upload   Total   Spent    Left  Speed
>   0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
> curl: (33) HTTP server doesn't seem to support byte ranges. Cannot resume.
Hm, yes. I see why... Resuming is not gonna work well on streams, since curl
will ask the server to restart the transfer at a given byte position while the
server is likely not to support that, exactly as your output above shows.
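To spell it out: resuming makes curl send a Range request header, and when the
server ignores it and answers 200 instead of 206, curl gives up with error 33.
You can watch it happen with -v (URL and offset made up):

  curl -v --continue-at 192512 http://example.com/stream -o capture.bin
  # request goes out with:  Range: bytes=192512-
  # stream server replies:  HTTP/1.1 200 OK   (not 206 Partial Content)
  # and curl bails out:     curl: (33) HTTP server doesn't seem to support
  #                         byte ranges. Cannot resume.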
> Is there a reason why it only retries after transient errors and not after
> closed connections? It seems that cURL would be more robust if it could
> continue to retry/restart a transfer until the time or file size has been
> exceeded.
The reason is quite simply that I just made it so and no one has requested any
different behaviour or provided any corrections. In my narrow-minded world, I
don't consider broken transfers to be very frequent, while transient errors can
be, and thus I figured those were the problems --retry should deal with. But
I'm indeed open to improving this area (too).
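So with the current behaviour, something like the command below only retries
when curl classifies the failure as transient (a timeout or certain error
responses from the server), not when the connection simply closes early (URL
made up):

  # retry up to 5 times, waiting 10 seconds between attempts
  curl --retry 5 --retry-delay 10 http://example.com/stream -o capture.bin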
> I know downloading streams is a specialized use for cURL. Probably not the
> typical HTTP application. That is why I was thinking that a --Stream option
> would be useful. It would make it easier to get it to do things that are
> unique to downloading a stream. But maybe there is a combination of existing
> options that will already work.
Given some further thought, I don't think you can make it work with any
current options. It will most likely require a new option or two.
It could probably be an option that sets the size limit at which a download is
stopped. And then --retry should be changed to continue broken downloads if
the complete file has not been transferred.
Then we'd need to think through how we'd know that a broken stream should be
continued, as opposed to a download of static content whose size we just don't
know.
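Just to sketch the idea in the meantime: a little wrapper script can
approximate it by reconnecting and appending until a target byte count has
been collected. Everything below is made up (URL, limit, retry count), and
since a live stream has no ranges there is nothing to resume as such; every
reconnect simply picks the stream up again:

  #!/bin/sh
  # keep appending stream data to capture.bin until we have TARGET bytes
  # or run out of tries
  URL=http://example.com/live-stream
  TARGET=104857600   # 100 MB
  TRIES=20
  : > capture.bin
  while [ "$TRIES" -gt 0 ]; do
    SIZE=$(wc -c < capture.bin | tr -d ' ')
    [ "$SIZE" -ge "$TARGET" ] && break
    curl -s "$URL" >> capture.bin
    TRIES=$((TRIES - 1))
    sleep 5
  done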
So, I see a bit of work ahead of us to make this work! ;-)
--
  Commercial curl and libcurl Technical Support: http://haxx.se/curl.html