
Curl does not limit output data size #11810

Closed
ElliotKillick opened this issue Sep 7, 2023 · 13 comments


@ElliotKillick

I did this

Consider the following: test="$(curl https://example.com)"

example.com could cause curl to download a very large amount of data, which could lead to an out-of-memory scenario when the result is stored in the shell variable test.

A similar DoS could occur when writing to disk, by filling the disk to capacity.

Both of these could completely lock up a system and cause other applications to abort (or crash, if they are written poorly). In the memory-exhaustion case, a system with an OOM reaper may also eventually take action.

I expected the following

Considering this is such a basic security issue, curl should have a built-in option to limit its output size, whether the output goes to stdout or to a file via -o/--output. Once the hard limit is reached, curl should stop reading from the network and terminate with curl error 23, or, probably better, a new error code that indicates this exact condition.

I propose one of --max-size (similar to --max-time), --max-output, or --max-out-size (similar to --max-filesize, which does not serve as a hard limit) to stay aligned with existing curl option names.

curl/libcurl version

curl 8.0.1

operating system

Fedora Linux 38

@dfandrich
Contributor

dfandrich commented Sep 7, 2023 via email

@ElliotKillick
Author

ElliotKillick commented Sep 7, 2023

--max-filesize won't work as a hard limit, for the reason given in the NOTE for that option in the curl manual. Changing the option to also enforce a hard limit would be an acceptable solution.

For Unix systems, head -c mostly works, although it was only recently added to POSIX, so not all Unix systems have it yet; OpenBSD, for example. dd would work on all Unix systems in practice, but due to how dd reads its input it may come with a performance hit (not to mention it's not pretty).
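
A minimal sketch of both workarounds (the 1 MiB cap, URL, and output filename are placeholder choices):

```sh
#!/bin/sh
# Cap the output at 1 MiB with head -c (POSIX, but not yet universal):
curl -s https://example.com | head -c 1048576 > out.html

# Portable dd fallback: bs=1 copies one byte at a time, which behaves
# correctly on pipes (larger block sizes can undercount on short reads)
# but is slow; this is the performance hit mentioned above.
curl -s https://example.com | dd bs=1 count=1048576 > out.html 2>/dev/null
```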

I've tried piping into head -c, but in POSIX sh this can't easily be done when you also want to check curl's exit code, because head (the last command run in the pipeline) conceals curl's exit code. set -o pipefail can propagate curl's exit code, but it too was only recently added to POSIX, and popular shells like Dash have yet to support it.
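
For shells that do support it, a sketch of the pipefail approach (bash shown; the limit and URL are placeholders):

```sh
#!/bin/bash
set -o pipefail
curl -s https://example.com | head -c 1048576 > out.html
status=$?
# With pipefail, a curl failure is no longer concealed by head's exit
# status. Note that hitting the cap itself also surfaces as a failure:
# head exits, the pipe closes, and curl dies with exit code 23
# (failure writing output).
if [ "$status" -ne 0 ]; then
    echo "pipeline failed with status $status" >&2
fi
```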

Moreover, I think this security feature should be doable in curl without requiring a shell. The above doesn't cover Windows, of course.

@bagder
Member

bagder commented Sep 7, 2023

Maybe we should make --max-filesize also work as a secondary check to stop "too much" data from being saved.
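
A sketch of how that would look from a script once --max-filesize acts as a hard cap (the 10 MB limit, URL, and output file are placeholders; 63 is the curl tool's documented "maximum file size exceeded" exit code):

```sh
#!/bin/sh
# Hard-cap the download even when the server sends no Content-Length
# (e.g. chunked transfer encoding):
curl --max-filesize 10000000 -o out.html https://example.com
rc=$?
if [ "$rc" -eq 63 ]; then
    # CURLE_FILESIZE_EXCEEDED: the cap was hit and out.html is truncated
    echo "download exceeded the configured size limit" >&2
fi
```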

@jay
Member

jay commented Sep 7, 2023

```diff
diff --git a/src/tool_cb_wrt.c b/src/tool_cb_wrt.c
index 2f8c6ac..e662a35 100644
--- a/src/tool_cb_wrt.c
+++ b/src/tool_cb_wrt.c
@@ -231,6 +231,10 @@ size_t tool_write_cb(char *buffer, size_t sz, size_t nmemb, void *userdata)
     }
   }
 
+  if(config->max_filesize && (outs->bytes + bytes > config->max_filesize)) {
+    return CURLE_FILESIZE_EXCEEDED;
+  }
+
 #ifdef WIN32
   fhnd = _get_osfhandle(fileno(outs->stream));
   /* if windows console then UTF-8 must be converted to UTF-16 */
```

It's documented to do nothing if the file size is not known, but I think there's flexibility to remove that because of the way the option is used.

@bagder
Member

bagder commented Sep 7, 2023

> I think there's flexibility to remove that because of the way that option is used.

I think so too. Will you make a PR out of this and we can give it a go?

jay added a commit to jay/curl that referenced this issue Sep 7, 2023
- Return CURLE_FILESIZE_EXCEEDED when the received data exceeds the
  maximum file size.

This is to handle those cases where the file size is not known in advance
(e.g. chunked encoding) but the bytes received exceed the max file size.

Prior to this change --max-filesize had no effect when the file size was
not provided by the server.

Reported-by: Elliot Killick

Fixes curl#11810
Closes #xxxx
bagder added a commit that referenced this issue Sep 8, 2023
Previously it would only stop them from getting started if the size is
known to be too big then.

Update the libcurl and curl docs accordingly.

Fixes #11810
Reported-by: Elliot Killick
@bagder changed the title from "Curl does not limit output data size potentially leading to DoS" to "Curl does not limit output data size" on Sep 11, 2023
bagder added a commit that referenced this issue Sep 21, 2023
Previously it would only stop them from getting started if the size is
known to be too big then.

Update the libcurl and curl docs accordingly.

Fixes #11810
Reported-by: Elliot Killick
@bagder closed this as completed in 914e49b on Sep 23, 2023
@ElliotKillick
Author

Thank you curl team for the fast response and turnaround on this issue!

I find the description of this issue coincides with CWE-400. Do you think it's worth advising users with a minor CVE? Due to curl's popularity, there are probably cases (e.g. on an embedded device with few resources and no GNU coreutils/Bash) where an automated script could run into this weakness with no easily portable way to fix it on their side until now. Thanks so much again.

@dfandrich
Contributor

dfandrich commented Sep 23, 2023 via email

ptitSeb pushed a commit to wasix-org/curl that referenced this issue Sep 25, 2023
Previously it would only stop them from getting started if the size is
known to be too big then.

Update the libcurl and curl docs accordingly.

Fixes curl#11810
Reported-by: Elliot Killick
Assisted-by: Jay Satiro
Closes curl#11820
@ElliotKillick
Author

Would you consider it a security problem in the application, then, due to it using curl incorrectly?

Sure, there are always going to be other means, in the form of some hack. However, the fact is that most people aren't going to do that, and as a result they will write insecure applications.

@dfandrich
Contributor

dfandrich commented Sep 27, 2023 via email

@ElliotKillick
Author

From the libcurl-security document:

> Instead, applications should monitor the amount of data received within the write or progress callback and abort once the limit is reached.

This document only provides a solution for libcurl, not the curl CLI tool.

My point is that there was an unreasonably large expectation on developers using the curl command to come up with some hack to solve this issue, given that no workaround was documented and curl is popular on many platforms. This is why I think a CVE makes sense. It would be beneficial to organizations using curl, because I'm willing to bet that most scripts aren't doing this, possibly at little to no fault of their own.

While curl may have been working as advertised, it didn't give developers the functionality needed to defend against the documented DoS threat, nor was there any robust way to get that functionality by piping into external programs in some cases.

@bagder
Member

bagder commented Sep 27, 2023

Several curl team members have repeated the same message many times:

> this is curl working as advertised

If this causes a security problem, then that problem is in the user's backyard and you should file a CVE for that product/service. It is not a security problem for curl itself.

Filing a CVE for curl working as intended and documented is not helpful to anyone.

@ElliotKillick
Author

I believe there's been a misunderstanding. The fact that curl's default behavior is working as advertised was never the issue. The issue is that curl had no option to set a maximum output size if the developer desired it.

Furthermore, there was no robust (actually implemented POSIX) solution to the problem should a developer want to include this security control in their script. It's like a known vulnerability with no patch or mitigation available on all platforms. Prior to this issue, if I had raised this problem for another product/service as you recommend, I may not have had any patch to give that dev team to fix the issue.

If the curl tool is handling an untrusted URL, then this is a security problem because it affects availability, which is part of the CVSS framework.

Not having this security control always available is unacceptable, because curl is often responsible for interacting with data from the most untrusted source there is: the Internet. Maybe that was acceptable at one point, but in 2023 it's not.

@bagder
Member

bagder commented Oct 25, 2023

You can continue to repeat your stance, and we can continue to repeat ours. It's not very productive though.

ElliotKillick added a commit to ElliotKillick/Mido that referenced this issue Jan 7, 2024
- Patch DoS vulnerability where Microsoft servers could send us data
  forever until OOM or disk space fills up. This issue could not
  feasibly be patched until now:
  curl/curl#11810
- Improve handle_curl_error
- Update win11x64 checksum
- Organize assets into their own folder
- Add attribution and copyright in download functions
- Update copyright year