curl-users
Re: Post large file with --data-binary
Date: Thu, 15 Dec 2016 16:23:22 +0100 (CET)
On Wed, 14 Dec 2016, Stephen Paul Weber wrote:
> Today I tried to upload a 5GB file using --data-binary from a machine with
> 800MB RAM (virtual machine). I got this message:
>
> curl: option --data-binary: out of memory
>
> After a lot of googling I found explanations about why `-d` and `-F` do this
> (they have to build the formatted request in memory)
-d does that, -F does not.
> and recommendations to use `--data-binary @large-file`
--data-binary does it the same way as -d works. They're mostly the same under
the hood.
> Is there any way to do this with curl, or should I maybe be looking into
> something simpler (fixed HTTP header + netcat maybe?)
The --data options all read the whole file into memory first and then send it.
It could of course be fixed if someone felt inclined to write some new logic
that acts differently. Until then you have to use another tool or another
method, like a -F post perhaps.
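As a sketch of those alternatives: `-T`/`--upload-file` streams the file from disk instead of buffering it, and `-F` also streams the file part of a multipart post. The upload URL below is a placeholder, not a real endpoint:

```shell
# Stream the file as the raw request body; -T/--upload-file reads and
# sends in chunks rather than loading the whole file into memory.
# By default this issues a PUT request.
curl -T large-file https://example.com/upload

# Multipart POST: -F streams the file contents from disk, so it also
# avoids holding the 5GB file in memory (the server must accept
# multipart/form-data instead of a raw body).
curl -F "file=@large-file" https://example.com/upload
```

Whether `-T` or `-F` fits depends on what the receiving server expects: `-T` sends the bytes as-is, while `-F` wraps them in a multipart envelope.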
--
 / daniel.haxx.se
-------------------------------------------------------------------
List admin: https://cool.haxx.se/list/listinfo/curl-users
FAQ: https://curl.haxx.se/docs/faq.html
Etiquette: https://curl.haxx.se/mail/etiquette.html

Received on 2016-12-15