curl errors after too many redirects #11871
Comments
That's not a "crash", it returns an error.
Regression from 3ee79c1
Not the counter that accumulates all headers over all redirects.

Yes: this means that if you allow following unbounded redirects in never-ending loops, curl can run out of memory.

Fixes #11871
Reported-by: Joshix-1 on github
It will not do that. curl keeps headers in memory for easy access after the transfer is done, which implicitly limits how many times it can loop when following redirects.
Why does it need all the headers? If it needs them, then that's fine with me; I just think there should be an error message mentioning that (or an option to not keep all the headers in memory, if that isn't too much work for such a pointless thing [following redirects forever]).
Your pointless is someone else's best feature. And vice versa. No, it cannot be asked to not store the headers.
Not the counter that accumulates all headers over all redirects.

Follow-up to 3ee79c1

Do a second check for 20 times the limit for the accumulated size for all headers.

Fixes curl#11871
Reported-by: Joshix-1 on github
Closes curl#11872
I did this
the command
outputs the following (shortened):
On a personal project with bigger response headers it exited more quickly (after fewer redirects), so I'm guessing that the size counter doesn't reset after a redirect.
I expected the following

--max-redirs -1 should work correctly. It should follow pointless redirects forever.

curl/libcurl version

could not reproduce on 7.76.1
operating system
Arch Linux
uname -a