
Re: Feature request: data stream upload after server response

From: Daniel Stenberg via curl-users <curl-users_at_lists.haxx.se>
Date: Mon, 10 Jun 2024 16:18:31 +0200 (CEST)

On Mon, 10 Jun 2024, fungs via curl-users wrote:

> When first listening for a status 200 response before sending the actual
> payload, things work faster and more predictably in those cases.

First: waiting for an HTTP response code before sending the request body
violates the protocol, since it just assumes the server will respond early. An
HTTP server is not required to respond with anything at all until the entire
request has been received.

If we want a response before sending the data, the standard way is to send the
"Expect: 100-continue" request header, although admittedly that is not as
widely supported by server endpoints as we would like.
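
For illustration, here is a minimal libcurl sketch (the URL is a placeholder
and the request body comes from stdin, the default CURLOPT_READDATA) that sets
the header explicitly; libcurl then holds back the body until the server has
answered or its internal wait expires. libcurl typically adds this header on
its own for larger plain HTTP/1.1 uploads, but forcing it does no harm:

  #include <curl/curl.h>

  /* Sketch only: force "Expect: 100-continue" on an upload.
     The URL is a placeholder; the body is read from stdin. */
  int main(void)
  {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if(curl) {
      struct curl_slist *hdrs =
        curl_slist_append(NULL, "Expect: 100-continue");

      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/upload");
      curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);       /* PUT the data */
      curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);

      curl_easy_perform(curl);

      curl_slist_free_all(hdrs);
      curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    return 0;
  }

With the command line tool, the equivalent is -H "Expect: 100-continue"
together with -T <file>.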

> At least for chunked data, this approach looks perfectly suitable to me from
> a practical point of view. Even if used as a replacement in the standard
> upload flow, it would only add a tiny bit of latency.

That's just your guess though. In N percent of the cases there is no response
before the data has started to be transmitted, so you would have to put a
timeout on the wait, and such a timeout would then hurt those N percent of
users.

All of this because some percentage of (bad) servers are slow when clients
send data "too early".

Does sending the Expect: header fix your use case? If not, what happens that
makes it not work?

-- 
  / daniel.haxx.se
  | Commercial curl support up to 24x7 is available!
  | Private help, bug fixes, support, ports, new features
  | https://curl.se/support.html