Re: Slow performance using curl_multi_* with HTTPS requests
From: Thomas Stähle via curl-and-php <curl-and-php_at_cool.haxx.se>
Date: Thu, 24 Jun 2021 16:33:55 +0200
>
> There's nothing I can do or advise without a lot more details on what
> you're doing and how, and why you say 102 seconds is the max.
>
Tell me what you want to know; I am not sure what information you need.
I have an API that performs several HTTP requests at once, asking different
APIs for information, and merges the results into its own response. There is
nothing more to it. Before the upgrade we had a certain response time; after
the upgrade the response time was around 40ms slower. With the tests I was
able to boil it down to the HTTP requests the API makes. Comparing the test
script's runtime on the old node with the runtime on the new node, there was
an increase of 63ms per request (6.39 seconds in total for the whole test
script).
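To be concrete, the plain test script is essentially this kind of loop (the
URL here is a placeholder, not the real one):

    <?php
    // Rough sketch of the plain test: 100 sequential requests.
    $start = microtime(true);
    for ($i = 0; $i < 100; $i++) {
        $ch = curl_init('https://example.com/'); // placeholder URL
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_exec($ch);
        curl_close($ch);
    }
    printf("total: %.2fs\n", microtime(true) - $start);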
The 102 seconds is just a rough guess. It makes sense that doing 10 requests
in parallel each iteration adds some overhead compared to doing just 100
single requests. We can see that overhead when comparing both test runs
against plain HTTP URLs (100.13s vs. 100.19s). The overhead for HTTPS calls,
from 101.89s to 106.07s, is way too much if everything around the 10 parallel
requests really happens in parallel. On the old node the difference between
HTTP and HTTPS is only around 1s (103.15s vs. 104.17s).
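The multi variant looks roughly like this: each of the 100 iterations runs
10 transfers through one multi handle (again, the URL is a placeholder):

    <?php
    // Rough sketch of the multi test: 100 iterations of 10 parallel requests.
    $start = microtime(true);
    for ($i = 0; $i < 100; $i++) {
        $mh = curl_multi_init();
        $handles = [];
        for ($j = 0; $j < 10; $j++) {
            $ch = curl_init('https://example.com/'); // placeholder URL
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_multi_add_handle($mh, $ch);
            $handles[] = $ch;
        }
        do {
            $status = curl_multi_exec($mh, $active);
            if ($active) {
                curl_multi_select($mh); // wait for socket activity
            }
        } while ($active && $status == CURLM_OK);
        foreach ($handles as $ch) {
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }
    printf("total: %.2fs\n", microtime(true) - $start);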
> The biggest curiosity might be how your multi program compares to the
> "plain"
> one.
Yes, exactly. The loop with 100 iterations is the same in both scripts. It
simply does not make sense that the plain script is so much faster than the
multi one, which just does 10 requests in parallel 100 times.
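If it helps narrow things down, per-request timings from curl_getinfo()
should show whether the extra time is spent in the TLS handshake; a sketch
of reading them from a handle after its transfer has finished:

    <?php
    // TLS handshake time is roughly appconnect minus connect.
    $ch = curl_init('https://example.com/'); // placeholder URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    printf("connect: %.3fs, appconnect (TLS done): %.3fs, total: %.3fs\n",
        curl_getinfo($ch, CURLINFO_CONNECT_TIME),
        curl_getinfo($ch, CURLINFO_APPCONNECT_TIME),
        curl_getinfo($ch, CURLINFO_TOTAL_TIME));
    curl_close($ch);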
_______________________________________________
https://cool.haxx.se/cgi-bin/mailman/listinfo/curl-and-php
Received on 2021-06-24