CURLOPT_MAX_RECV_SPEED_LARGE doesn't work when performing multiple transfers #4084

Closed
Ironbars13 opened this issue Jul 1, 2019 · 2 comments

@Ironbars13

Ironbars13 commented Jul 1, 2019

When downloading multiple files in a row without resetting the CURL options and limiting download speed using CURLOPT_MAX_RECV_SPEED_LARGE, only the first file downloaded will be rate-limited. Subsequent files will download as fast as possible.

After checking a few versions, doing some printf debugging, and scanning through recent pull requests, it seems the issue was introduced in 7.60 with this commit. Curl_pgrsLimitWaitTime() uses progress.downloaded and progress.dl_limit_size via Curl_ratelimit(), but progress.dl_limit_size is not reset after the transfer ends. When the second file is downloaded, Curl_pgrsLimitWaitTime() ends up computing negative values that are clearly unintended, the function returns 0, and no rate limiting occurs.

If I add

data->progress.downloaded = 0;
data->progress.uploaded = 0;

to Curl_pgrsStartNow(), the problem goes away, because the subsequent call to Curl_ratelimit() then resets dl_limit_size. However, I'm not familiar enough with the rest of the codebase to know whether this is the best way to solve the problem, or whether it could cause other problems.

The same problem exists when using CURLOPT_MAX_SEND_SPEED_LARGE in the same way.

@bagder
Member

bagder commented Jul 2, 2019

Which libcurl version did you experience this with? Do you happen to have test code that can reproduce?

@Ironbars13
Author

I noticed it on version 7.64.1, then tested it on a few other versions. On 7.59 it worked properly and on 7.60 it did not.

. . .

CURL* cHandle = curl_easy_init();
CURLcode cResult;
FileInfo fileInfo;

curl_easy_setopt(cHandle, CURLOPT_URL, "http://mywebsite.com/somefile.txt");
curl_easy_setopt(cHandle, CURLOPT_WRITEFUNCTION, WriteCallback);
curl_easy_setopt(cHandle, CURLOPT_WRITEDATA, &fileInfo);
curl_easy_setopt(cHandle, CURLOPT_MAX_RECV_SPEED_LARGE, (curl_off_t)37000);

for (int i = 0; i < 5; ++i)
{
    auto start = std::chrono::steady_clock::now();
    cResult = curl_easy_perform(cHandle);
    auto end = std::chrono::steady_clock::now();
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count();
    std::cout << "MS: " << ms << std::endl;
    std::cout << "Curl result: " << curl_easy_strerror(cResult) << std::endl;
}

curl_easy_cleanup(cHandle);

The file I'm downloading is 226 KB, so I would expect each transfer to take about six seconds at 37 KB/s.

Output:

MS: 6145
Curl result: No error
MS: 15
Curl result: No error
MS: 1
Curl result: No error
MS: 1
Curl result: No error
MS: 1
Curl result: No error

You can also extend the loop to include the calls that set the options; the result is the same.

@bagder bagder self-assigned this Jul 29, 2019
bagder added a commit that referenced this issue Jul 29, 2019
... to make CURLOPT_MAX_RECV_SPEED_LARGE and
CURLOPT_MAX_SEND_SPEED_LARGE work correctly on subsequent transfers that
reuse the same handle.

Fixed-by: Ironbars13 on github
Fixes #4084
@bagder bagder closed this as completed in d23e87d Jul 29, 2019
@lock lock bot locked as resolved and limited conversation to collaborators Oct 27, 2019