Subject: Re: How is --parallel handled when --proxy is set ?
From: Markur Sens via curl-users <curl-users_at_lists.haxx.se>
Date: Mon, 29 May 2023 12:16:12 +0300
> On 29 May 2023, at 12:02 PM, Daniel Stenberg <daniel_at_haxx.se> wrote:
>
> On Mon, 29 May 2023, Markur Sens via curl-users wrote:
>
>> Can someone explain how the CLI handles —parallel transfers when —proxy is set?
>
> I assume this means --parallel and --proxy etc (using two dashes, not one).
>
>> curl —proxy \
>
> The option --proxy takes an argument that is the proxy host name + port, and ideally also using a "scheme" that says what kind of proxy it is.
I know :-) I omitted some syntax details for clarity
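For illustration only (the address below is a placeholder, not the proxy I actually use), the argument with an explicit scheme would be something like:

  curl --proxy socks5h://127.0.0.1:1080 https://example.com/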
>
>> —parallel \
>> —parallel-max 10
>> —retry \
>> —retry-delay 60 \
>> —retry
>>
>> Are the following assumptions correct?
>> - At any given moment there are at most 10 connections to the proxy / end host. Is this correct?
>
> No. It means that at any given moment there is a maximum of 10 *transfers* going on. If you are doing transfers to many different host names, curl will keep a number of old connections around for later possible connection reuse.
>
> So at any given moment there can be 10 connections alive for ongoing transfers plus N previously used connections that are still alive.
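For concreteness, here is roughly the invocation with the two-dash options spelled out; the proxy address, retry count and URLs are placeholders rather than my real values:

  curl --proxy socks5h://127.0.0.1:1080 \
       --parallel \
       --parallel-max 10 \
       --retry 3 \
       --retry-delay 60 \
       https://example.com/file[1-100].txt

So if I read the above correctly, up to 10 of those transfers run at once, while the number of open connections can temporarily be higher because of the kept-alive ones.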
>
>> - The —retry applies to what host exactly?
>
> It applies to *the transfer*.
>
>> There are cases where the end host errors (e.g. connection refused or similar) or the proxy errors (e.g. SOCKS server timeouts, user refused, etc.). If the end host throws an error, is the proxy connection kept open, or is the complete local-proxy-end-host connection reset completely?
>
> It depends. curl tries to do "as good as possible" in those situations and not close down more connections than it needs to, because it might want to use that connection again for the next transfer.
Any ideas how I can make this “best effort” policy more “transparent” / predictable?
Maybe a pointer to the source code I could have a look at?
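One way I can think of to at least make the reuse decisions visible (proxy address and URLs below are placeholders): run with --verbose, which logs when an existing connection is reused, and add a --write-out line that prints how many new connections each transfer needed:

  curl --verbose --proxy socks5h://127.0.0.1:1080 --parallel \
       --write-out '%{url_effective} new connects: %{num_connects}\n' \
       --output /dev/null --output /dev/null \
       https://example.com/a https://example.com/b

(If I remember right, the connection pool handling lives around lib/conncache.c and lib/url.c in the curl source, but I have not double-checked that recently.)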
Thanks a lot :-)
>
> --
>
> / daniel.haxx.se
> | Commercial curl support up to 24x7 is available!
> | Private help, bug fixes, support, ports, new features
> | https://curl.se/support.html