
Curl 7.88.X segmentation fault with -D/--dump-header when it fails to create the file #10570

Closed
jrabasco opened this issue Feb 20, 2023 · 2 comments

Comments


jrabasco commented Feb 20, 2023

Steps to reproduce

If you use the -D/--dump-header <filename> option, curl segfaults when the file
cannot be created (for instance when the target directory lacks write permission):

$ mkdir hdr
$ chmod -w hdr
$ curl -D hdr/headers.dump https://google.com
Segmentation fault (core dumped)

I expected the following

On older versions of curl, it just prints an error message and exits with rcode 23
(the below is what happens with version 7.29.0):

$ /usr/bin/curl -D hdr/headers.dump https://google.com
Warning: Failed to open hdr/headers.dump
$ echo $?
23

curl/libcurl version + operating systems

I was able to reproduce with the following two setups:

  • RHEL 7.9 with curl version 7.88.0:
    $ uname -a
    Linux [redacted] 3.10.0-[redacted] #1 SMP [redacted] x86_64 x86_64 x86_64 GNU/Linux
    $ curl -V
    curl 7.88.0 (x86_64-unknown-linux-gnu) libcurl/7.88.0 OpenSSL/1.0.2zg zlib/1.2.13 libssh2/1.9.0
    Release-Date: 2023-02-15
    Protocols: dict file ftp ftps gopher gophers http https imap imaps mqtt pop3 pop3s rtsp scp sftp smb smbs smtp smtps telnet tftp
    Features: alt-svc AsynchDNS GSS-API HSTS HTTPS-proxy IPv6 Kerberos Largefile libz NTLM NTLM_WB SPNEGO SSL threadsafe TLS SRP UnixSockets
    
  • Ubuntu 20.04 on WSL with curl version 7.88.1 (built from source):
    $ uname -a
    Linux [redacted] 5.10.16.3-microsoft-standard-WSL2 #1 SMP Fri Apr 2 22:23:49 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
    $ curl -V 
    curl 7.88.1 (x86_64-pc-linux-gnu) libcurl/7.88.1 OpenSSL/1.1.1f zlib/1.2.11 libidn2/2.2.0
    Release-Date: 2023-02-20
    Protocols: dict file ftp ftps gopher gophers http https imap imaps mqtt pop3 pop3s rtsp smb smbs smtp smtps telnet tftp
    Features: alt-svc AsynchDNS HSTS HTTPS-proxy IDN IPv6 Largefile libz NTLM NTLM_WB SSL threadsafe TLS-SRP UnixSockets
    

Note that I was NOT able to reproduce this with the Windows binary (downloaded from the curl website).


adamncasey commented Feb 20, 2023

$ valgrind ./curl -D hdr/headers.dump https://www.google.com
==10952== Memcheck, a memory error detector
==10952== Copyright (C) 2002-2017, and GNU GPL'd, by Julian Seward et al.
==10952== Using Valgrind-3.15.0 and LibVEX; rerun with -h for copyright info
==10952== Command: ./curl -D hdr/headers.dump https://www.google.com
==10952==
==10952== Invalid read of size 4
==10952==    at 0x531D0D4: fclose <libc.so>
==10952==    by 0x40E56F: single_transfer (tool_operate.c:987)
==10952==    by 0x411E9D: transfer_per_config (tool_operate.c:2607)
==10952==    by 0x411E9D: create_transfer (tool_operate.c:2623)
==10952==    by 0x412894: serial_transfers (tool_operate.c:2433)
==10952==    by 0x412894: run_all_transfers (tool_operate.c:2648)
==10952==    by 0x412894: operate (tool_operate.c:2762)
==10952==    by 0x4038B3: main (tool_main.c:283)
==10952==  Address 0x0 is not stack'd, malloc'd or (recently) free'd

curl/src/tool_operate.c

Lines 985 to 989 in cbf5717

if(!per->prev || per->prev->config != config) {
  newfile = fopen(config->headerfile, "wb+");
  fclose(newfile);
}
newfile = fopen(config->headerfile, "ab+");
Looks like the issue.

One of us can submit a PR in the next day or so

@jrabasco
Author

Thanks for the quick response!

bch pushed a commit to bch/curl that referenced this issue Jul 19, 2023