Speed comparison wget vs. curl (HTTPS/HTTP1/GnuTLS)
Date: Wed, 29 Jun 2016 19:59:23 +0200
I recently made a few comparisons between curl 7.50.0-DEV and wget 1.18 and
was astonished that wget outperformed curl by a fair amount on single
HTTPS request/response cycles.
So my question is: what is 'wrong' with that version of curl? Or what did
I overlook - maybe some special options?
Some details about my setup:
Debian SID amd64, both wget and curl built/installed with GnuTLS (3.4.13)
Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz
2MBit/s DSL, ping to www.google.com is ~106ms
Downloading a non-existent page at www.google.com (a.html) via HTTPS/HTTP1.1.
The figures from 'time' are the fastest I got in 10 tries.
$ time curl -s -o/dev/null https://www.google.com/a.html
$ time wget -q -o/dev/null --no-alpn https://www.google.com/a.html
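To narrow down where the time goes, curl's -w write-out variables can split a single request into phases; time_appconnect marks the end of the TLS handshake, so the difference to time_connect isolates the TLS cost. A sketch against the same test URL as above:

```shell
# Per-phase timing for one HTTPS request. appconnect - connect is
# the TLS handshake cost; starttransfer - appconnect is the
# request/response round trip.
curl -s -o /dev/null \
  -w 'namelookup:    %{time_namelookup}s\nconnect:       %{time_connect}s\nappconnect:    %{time_appconnect}s\nstarttransfer: %{time_starttransfer}s\ntotal:         %{time_total}s\n' \
  https://www.google.com/a.html
```

If the startup penalty sits between connect and appconnect, it is spent in the TLS layer rather than in DNS or TCP setup.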
Looks like there is a fairly large (~240ms) startup penalty hidden
somewhere. Building with OpenSSL does not make much difference.
BTW, the only way I found to disable HTTP/2 was --no-alpn; --no-http2 did
not switch it off.
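If I remember the option set correctly, --http1.1 (available since curl 7.33.0) may be the cleaner way to force HTTP/1.1 without touching ALPN itself:

```shell
# Force HTTP/1.1 explicitly instead of disabling ALPN altogether;
# ALPN then only offers http/1.1 to the server.
curl --http1.1 -s -o /dev/null https://www.google.com/a.html
```

That would keep the TLS extension path identical to the wget run while still ruling out an HTTP/2 upgrade.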
$ curl --version
curl 7.50.0-DEV (x86_64-pc-linux-gnu) libcurl/7.50.0-DEV GnuTLS/3.4.13
zlib/1.2.8 libidn/1.32 libpsl/0.11.0 (+libicu/55.1) nghttp2/1.11.1
Protocols: dict file ftp ftps gopher http https imap imaps pop3 pop3s rtsp smb
smbs smtp smtps telnet tftp
Features: IDN IPv6 Largefile NTLM NTLM_WB SSL libz TLS-SRP HTTP2 UnixSockets
$ wget --version
GNU Wget 1.18 built on linux-gnu.
-cares +digest -gpgme +https +ipv6 +iri +large-file -metalink +nls
+ntlm +opie +psl +ssl/gnutls