curl-library
Re: Download Averaging Issue
Date: Fri, 18 Nov 2011 18:08:36 +0000
On 18 November 2011 12:26, Daniel Stenberg <daniel_at_haxx.se> wrote:
>
> On Fri, 18 Nov 2011, Rob Ward wrote:
>
>> As it seems no one has any information with regards to this I am going
>> to begin looking at implementing the first proposed solution as time
>> permits in order to see if it is feasible and produces better results. I
>> will post a patch as soon as I have any potentially working solution for
>> your review/comment.
>
>
> Sorry, I'm quite backlogged... I think the "issue" to solve is to have
> the algorithm not use the entire transfer time as basis when it checks the
> transfer speed, but instead perhaps the last N seconds. libcurl already
> keeps track of the speed over the last 5 seconds and that might be a start.
>
> --
>
> / daniel.haxx.se
>
Hi Daniel,
Thanks for that, I understand that you have been very busy as of late,
especially with the two releases as well. I saw that curl already stores the
data for the average speed over the last 5 seconds and looked at using this as
a potential solution. However, if the download had been running at 10 meg
(maxing out the connection) and the max speed was then set to 1 meg, it would
take about 6 seconds (if I did my maths right) before the download would
resume. So for a streaming system, where a minimum download rate is needed
constantly for media playback, I can see how using an average (even over only
a few seconds) could cause issues when a relatively large change in the
desired download rate is required.
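To make the arithmetic concrete, here is a minimal stand-alone sketch (plain
C, not the libcurl internals, and treating "meg" as a generic speed unit) of a
throttle that pauses whenever the trailing 5-second average is above the cap.
With the window primed at 10 meg/s and the cap then dropped to 1 meg/s, the
average only falls back under the cap after several seconds of the transfer
sitting idle:

/* Stand-alone sketch (not libcurl code) of a throttle that pauses whenever
 * the trailing 5-second average exceeds the cap.  "meg" is used as a
 * generic speed unit; the arithmetic is the same whatever the unit is. */
#include <stdio.h>

#define WINDOW 5

int main(void)
{
    /* Window of per-second transfer amounts: five seconds at 10 meg/s. */
    double window[WINDOW] = {10, 10, 10, 10, 10};
    double cap = 1.0; /* the limit has just been dropped to 1 meg/s */
    int t, i;

    for (t = 5; t < 15; t++) {
        double sum = 0.0, avg, moved;

        for (i = 0; i < WINDOW; i++)
            sum += window[i];
        avg = sum / WINDOW;

        /* If the trailing average is still over the new cap the throttle
         * moves nothing this second; otherwise it runs at the cap. */
        moved = (avg > cap) ? 0.0 : cap;

        printf("t=%2ds  avg=%4.1f meg/s  %s\n",
               t, avg, moved > 0.0 ? "transferring" : "paused");

        /* Slide the window forward by one second. */
        for (i = 0; i < WINDOW - 1; i++)
            window[i] = window[i + 1];
        window[WINDOW - 1] = moved;
    }
    return 0;
}

Running that, the transfer sits paused from t=5 through t=9 and only resumes
at t=10, which is where my roughly 6-second estimate comes from.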
Does this make sense in any way (I could be confusing the issue for myself)?
Thanks for your help,
Cheers,
Rob
--
Rob Ward
www.rob-ward.co.uk
-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-library
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2011-11-18