# curl man page generator
-This is the curl man page generator. It generates a single nroff man page
+`managen` is the curl man page generator. It generates a single nroff man page
output from the set of source files in this directory.
The `mainpage.idx` file lists all files that are rendered in that order to
Uploading contents to an SMTP server means sending an email. With or without
TLS.
## TELNET
-Telling curl to fetch a telnet URL starts an interactive session where it
-sends what it reads on stdin and outputs what the server sends it.
+Fetching a telnet URL starts an interactive session where it sends what it
+reads on stdin and outputs what the server sends it.
## TFTP
curl can do TFTP downloads and uploads.
# `--alt-svc`
-This option enables the alt-svc parser in curl. If the filename points to an
-existing alt-svc cache file, that gets used. After a completed transfer, the
-cache is saved to the filename again if it has been modified.
+Enable the alt-svc parser. If the filename points to an existing alt-svc cache
+file, that gets used. After a completed transfer, the cache is saved to the
+filename again if it has been modified.
Specify a "" filename (zero length) to avoid loading/saving and make curl just
handle the cache in memory.
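For instance, the cache could be kept in a local file (altsvc.txt is just an
example name):

    curl --alt-svc altsvc.txt https://example.com/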
# `--anyauth`
-Tells curl to figure out authentication method by itself, and use the most
-secure one the remote site claims to support. This is done by first doing a
-request and checking the response-headers, thus possibly inducing an extra
-network round-trip. This is used instead of setting a specific authentication
+Figure out authentication method automatically, and use the most secure one
+the remote site claims to support. This is done by first doing a request and
+checking the response-headers, thus possibly inducing an extra network
+round-trip. This option is used instead of setting a specific authentication
method, which you can do with --basic, --digest, --ntlm, and --negotiate.
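For example, combined with --user (the credentials shown are placeholders):

    curl --anyauth --user name:password https://example.com/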
Using --anyauth is not recommended if you do uploads from stdin, since it may
# `--basic`
-Tells curl to use HTTP Basic authentication with the remote host. This is the
-default and this option is usually pointless, unless you use it to override a
+Use HTTP Basic authentication with the remote host. This method is the default
+and this option is usually pointless, unless you use it to override a
previously set option that sets a different authentication method (such as
--ntlm, --digest, or --negotiate).
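For example, together with --user (placeholder credentials):

    curl --basic --user name:password https://example.com/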
# `--ca-native`
-Tells curl to use the CA store from the native operating system to verify the
-peer. By default, curl otherwise uses a CA store provided in a single file or
-directory, but when using this option it interfaces the operating system's
-own vault.
+Use the CA store from the native operating system to verify the peer. By
+default, curl otherwise uses a CA store provided in a single file or
+directory, but when using this option it interfaces the operating system's own
+vault.
This option works for curl on Windows when built to use OpenSSL, wolfSSL
(added in 8.3.0) or GnuTLS (added in 8.5.0). When curl on Windows is built to
# `--cacert`
-Tells curl to use the specified certificate file to verify the peer. The file
-may contain multiple CA certificates. The certificate(s) must be in PEM
-format. Normally curl is built to use a default file for this, so this option
-is typically used to alter that default file.
+Use the specified certificate file to verify the peer. The file may contain
+multiple CA certificates. The certificate(s) must be in PEM format. Normally
+curl is built to use a default file for this, so this option is typically used
+to alter that default file.
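For example, pointing at a CA bundle file of your own (the filename is a
placeholder):

    curl --cacert my-ca-bundle.pem https://example.com/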
curl recognizes the environment variable named 'CURL_CA_BUNDLE' if it is set
and the TLS backend is not Schannel, and uses the given path as a path to a CA
# `--capath`
-Tells curl to use the specified certificate directory to verify the
-peer. Multiple paths can be provided by separating them with ":" (e.g.
-"path1:path2:path3"). The certificates must be in PEM format, and if curl is
-built against OpenSSL, the directory must have been processed using the
-c_rehash utility supplied with OpenSSL. Using --capath can allow
-OpenSSL-powered curl to make SSL-connections much more efficiently than using
---cacert if the --cacert file contains many CA certificates.
+Use the specified certificate directory to verify the peer. Multiple paths can
+be provided by separating them with a colon (`:`) (e.g. `path1:path2:path3`). The
+certificates must be in PEM format, and if curl is built against OpenSSL, the
+directory must have been processed using the c_rehash utility supplied with
+OpenSSL. Using --capath can allow OpenSSL-powered curl to make SSL-connections
+much more efficiently than using --cacert if the --cacert file contains many
+CA certificates.
If this option is set, the default capath value is ignored.
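For example, using a typical system certificate directory (the path is just an
example):

    curl --capath /etc/ssl/certs https://example.com/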
# `--cert-status`
-Tells curl to verify the status of the server certificate by using the
-Certificate Status Request (aka. OCSP stapling) TLS extension.
+Verify the status of the server certificate by using the Certificate Status
+Request (aka. OCSP stapling) TLS extension.
If this option is enabled and the server sends an invalid (e.g. expired)
response, if the response suggests that the server certificate has been
revoked, or no response at all is received, the verification fails.
-This is currently only implemented in the OpenSSL and GnuTLS backends.
+This support is currently only implemented in the OpenSSL and GnuTLS backends.
# `--cert-type`
-Tells curl what type the provided client certificate is using. PEM, DER, ENG
-and P12 are recognized types.
+Set the type of the provided client certificate. PEM, DER, ENG and P12 are
+recognized types.
The default type depends on the TLS backend and is usually PEM, however for
Secure Transport and Schannel it is P12. If --cert is a pkcs11: URI then ENG is
# `--cert`
-Tells curl to use the specified client certificate file when getting a file
-with HTTPS, FTPS or another SSL-based protocol. The certificate must be in
-PKCS#12 format if using Secure Transport, or PEM format if using any other
-engine. If the optional password is not specified, it is queried for on
-the terminal. Note that this option assumes a certificate file that is the
-private key and the client certificate concatenated. See --cert and --key to
-specify them independently.
+Use the specified client certificate file when getting a file with HTTPS, FTPS
+or another SSL-based protocol. The certificate must be in PKCS#12 format if
+using Secure Transport, or PEM format if using any other engine. If the
+optional password is not specified, it is queried for on the terminal. Note
+that this option assumes a certificate file that is the private key and the
+client certificate concatenated. See --cert and --key to specify them
+independently.
In the \<certificate\> portion of the argument, you must escape the character
`:` as `\:` so that it is not recognized as the password delimiter. Similarly,
# `--compressed-ssh`
-Enables built-in SSH compression.
-This is a request, not an order; the server may or may not do it.
+Enables built-in SSH compression. This is a request, not an order; the server
+may or may not do it.
Only write one option per physical line in the config file. A single line is
required to be no more than 10 megabytes (since 8.2.0).
-Specify the filename to --config as '-' to make curl read the file from stdin.
+Specify the filename to --config as a single minus ("-") to make curl read
+the file from stdin.
Note that to be able to specify a URL in the config file, you need to specify
it using the --url option, and not by simply writing the URL on its own
# `--connect-to`
-For a request to the given `HOST1:PORT1` pair, connect to `HOST2:PORT2`
-instead. This option is suitable to direct requests at a specific server,
-e.g. at a specific cluster node in a cluster of servers. This option is only
-used to establish the network connection. It does NOT affect the hostname/port
-that is used for TLS/SSL (e.g. SNI, certificate verification) or for the
-application protocols. `HOST1` and `PORT1` may be the empty string, meaning
-"any host/port". `HOST2` and `PORT2` may also be the empty string, meaning
-"use the request's original host/port".
+For a request intended for the `HOST1:PORT1` pair, connect to `HOST2:PORT2`
+instead. This option is only used to establish the network connection. It does
+NOT affect the hostname/port number that is used for TLS/SSL (e.g. SNI,
+certificate verification) or for the application protocols.
+
+`HOST1` and `PORT1` may be empty strings, meaning any host or any port number.
+`HOST2` and `PORT2` may also be empty strings, meaning use the request's
+original hostname and port number.
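For example, to send a request meant for example.com port 443 to another host
instead (backend.example.com and its port are placeholders):

    curl --connect-to example.com:443:backend.example.com:8443 https://example.com/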
A hostname specified to this option is compared as a string, so it needs to
match the name used in request URL. It can be either numerical such as
# `--continue-at`
-Continue/Resume a previous file transfer at the given offset. The given offset
-is the exact number of bytes that are skipped, counting from the beginning
-of the source file before it is transferred to the destination. If used with
-uploads, the FTP server command SIZE is not used by curl.
+Resume a previous transfer from the given byte offset. The given offset is the
+exact number of bytes that are skipped, counting from the beginning of the
+source file before it is transferred to the destination. If used with uploads,
+the FTP server command SIZE is not used by curl.
-Use "-C -" to tell curl to automatically find out where/how to resume the
+Use "-C -" to instruct curl to automatically find out where/how to resume the
transfer. It then uses the given output/input files to figure that out.
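For example, resuming an interrupted download into the same local file (the
URL is a placeholder):

    curl -C - -O https://example.com/bigfile.tar.gz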
Specify to which file you want curl to write all cookies after a completed
operation. Curl writes all cookies from its in-memory cookie storage to the
-given file at the end of operations. If no cookies are known, no data is
-written. The file is created using the Netscape cookie file format. If you set
-the filename to a single dash, "-", the cookies are written to stdout.
+given file at the end of operations. Even if no cookies are known, the file is
+still created, which removes any formerly stored cookies from it. The file
+uses the Netscape cookie file format. If you set the filename to a single
+minus, "-", the cookies are written to stdout.
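For example, saving all cookies to a local file (cookies.txt is an example
name):

    curl --cookie-jar cookies.txt https://example.com/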
The file specified with --cookie-jar is only used for output. No cookies are
read from the file. To read cookies, use the --cookie option. Both options
# `--cookie`
Pass the data to the HTTP server in the Cookie header. It is supposedly the
-data previously received from the server in a "Set-Cookie:" line. The data
-should be in the format "NAME1=VALUE1; NAME2=VALUE2". This makes curl use the
+data previously received from the server in a `Set-Cookie:` line. The data
+should be in the format `NAME1=VALUE1; NAME2=VALUE2` or as a single filename.
+
+When given a set of specific cookies and not a filename, it makes curl use the
cookie header with this content explicitly in all outgoing request(s). If
multiple requests are done due to authentication, followed redirects or
-similar, they all get this cookie passed on.
+similar, they all get this cookie header passed on.
-If no '=' symbol is used in the argument, it is instead treated as a filename
+If no `=` symbol is used in the argument, it is instead treated as a filename
to read previously stored cookies from. This option also activates the cookie
engine which makes curl record incoming cookies, which may be handy if you are
using this in combination with the --location option or do multiple URL
transfers on the same invoke.
-If the filename is exactly a minus ("-"), curl instead reads the contents from
-stdin. If the filename is an empty string ("") and is the only cookie input,
-curl activates the cookie engine without any cookies.
+If the filename is a single minus ("-"), curl reads the contents from stdin.
+If the filename is an empty string ("") and is the only cookie input, curl
+activates the cookie engine without any cookies.
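For example, sending two cookies given directly on the command line (the names
and values are placeholders):

    curl --cookie "NAME1=VALUE1; NAME2=VALUE2" https://example.com/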
The file format of the file to read cookies from should be plain HTTP headers
(Set-Cookie style) or the Netscape/Mozilla cookie file format.
The file specified with --cookie is only used as input. No cookies are written
-to the file. To store cookies, use the --cookie-jar option.
+to that file. To store cookies, use the --cookie-jar option.
If you use the Set-Cookie file format and do not specify a domain then the
cookie is not sent since the domain never matches. To address this, set a
# `--curves`
-Tells curl to request specific curves to use during SSL session establishment
-according to RFC 8422, 5.1. Multiple algorithms can be provided by separating
-them with `:` (e.g. `X25519:P-521`). The parameter is available identically in
-the OpenSSL `s_client` and `s_server` utilities.
+Set specific curves to use during SSL session establishment according to RFC
+8422, 5.1. Multiple algorithms can be provided by separating them with `:`
+(e.g. `X25519:P-521`). The parameter is available identically in the OpenSSL
+`s_client` and `s_server` utilities.
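For example, requesting a single curve (shown with a placeholder URL):

    curl --curves X25519 https://example.com/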
--curves allows an OpenSSL-powered curl to make SSL-connections with exactly
the (EC) curve requested by the client, avoiding nontransparent client/server
# `--data-ascii`
-This is just an alias for --data.
+This option is just an alias for --data.
# `--data-binary`
-This posts data exactly as specified with no extra processing whatsoever.
+Post data exactly as specified with no extra processing whatsoever.
If you start the data with the letter @, the rest should be a filename. Data
is posted in a similar manner as --data does, except that newlines and
# `--data-raw`
-This posts data similarly to --data but without the special
-interpretation of the @ character.
+Post data similarly to --data but without the special interpretation of the @
+character.
# `--data-urlencode`
-This posts data, similar to the other --data options with the exception
-that this performs URL-encoding.
+Post data, similar to the other --data options with the exception that this
+performs URL-encoding.
To be CGI-compliant, the \<data\> part should begin with a *name* followed by
a separator and a content specification. The \<data\> part can be passed to
curl using one of the following syntaxes:
## content
-This makes curl URL-encode the content and pass that on. Just be careful
-so that the content does not contain any = or @ symbols, as that makes
-the syntax match one of the other cases below!
+URL-encode the content and pass that on. Just be careful so that the content
+does not contain any `=` or `@` symbols, as that makes the syntax match one of
+the other cases below!
## =content
-This makes curl URL-encode the content and pass that on. The preceding =
-symbol is not included in the data.
+URL-encode the content and pass that on. The preceding `=` symbol is not
+included in the data.
## name=content
-This makes curl URL-encode the content part and pass that on. Note that
-the name part is expected to be URL-encoded already.
+URL-encode the content part and pass that on. Note that the name part is
+expected to be URL-encoded already.
## @filename
-This makes curl load data from the given file (including any newlines),
-URL-encode that data and pass it on in the POST.
+Load data from the given file (including any newlines), URL-encode that data
+and pass it on in the POST.
## name@filename
-This makes curl load data from the given file (including any newlines),
-URL-encode that data and pass it on in the POST. The name part gets an equal
-sign appended, resulting in *name=urlencoded-file-content*. Note that the
-name is expected to be URL-encoded already.
+Load data from the given file (including any newlines), URL-encode that data
+and pass it on in the POST. The name part gets an equal sign appended,
+resulting in *name=urlencoded-file-content*. Note that the name is expected to
+be URL-encoded already.
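For example, using the name=content form where the content part gets
URL-encoded (the field name and URL are placeholders):

    curl --data-urlencode "comment=hello there" https://example.com/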
Sends the specified data in a POST request to the HTTP server, in the same way
that a browser does when a user has filled in an HTML form and presses the
-submit button. This makes curl pass the data to the server using the
+submit button. This option makes curl pass the data to the server using the
content-type application/x-www-form-urlencoded. Compare to --form.
--data-raw is almost the same but does not have a special interpretation of
# `--delegation`
-Set LEVEL to tell the server what it is allowed to delegate when it
-comes to user credentials.
+Set LEVEL to what curl is allowed to delegate when it comes to user
+credentials.
## none
Do not allow any delegation.
# `--digest`
-Enables HTTP Digest authentication. This is an authentication scheme that
-prevents the password from being sent over the wire in clear text. Use this in
-combination with the normal --user option to set username and password.
+Enables HTTP Digest authentication. This authentication scheme avoids sending
+the password over the wire in clear text. Use this in combination with the
+normal --user option to set username and password.
# `--disable-eprt`
-Tell curl to disable the use of the EPRT and LPRT commands when doing active
-FTP transfers. Curl normally first attempts to use EPRT before using PORT, but
-with this option, it uses PORT right away. EPRT is an extension to the
-original FTP protocol, and does not work on all servers, but enables more
-functionality in a better way than the traditional PORT command.
+Disable the use of the EPRT and LPRT commands when doing active FTP transfers.
+Curl normally first attempts to use EPRT before using PORT, but with this
+option, it uses PORT right away. EPRT is an extension to the original FTP
+protocol, and does not work on all servers, but enables more functionality in
+a better way than the traditional PORT command.
--eprt can be used to explicitly enable EPRT again and --no-eprt is an alias
for --disable-eprt.
# `--disable-epsv`
-Tell curl to disable the use of the EPSV command when doing passive FTP
-transfers. Curl normally first attempts to use EPSV before PASV, but with this
-option, it does not try EPSV.
+Disable the use of the EPSV command when doing passive FTP transfers. Curl
+normally first attempts to use EPSV before PASV, but with this option, it does
+not try EPSV.
--epsv can be used to explicitly enable EPSV again and --no-epsv is an alias
for --disable-epsv.
# `--disallow-username-in-url`
-This tells curl to exit if passed a URL containing a username. This is probably
-most useful when the URL is being provided at runtime or similar.
+Exit with error if passed a URL containing a username. Probably most useful
+when the URL is being provided at runtime or similar.
# `--dns-interface`
-Tell curl to send outgoing DNS requests through the given interface. This
-option is a counterpart to --interface (which does not affect DNS). The
-supplied string must be an interface name (not an address).
+Send outgoing DNS requests through the given interface. This option is a
+counterpart to --interface (which does not affect DNS). The supplied string
+must be an interface name (not an address).
# `--dns-ipv4-addr`
-Tell curl to bind to a specific IP address when making IPv4 DNS requests, so
-that the DNS requests originate from this address. The argument should be a
-single IPv4 address.
+Bind to a specific IP address when making IPv4 DNS requests, so that the DNS
+requests originate from this address. The argument should be a single IPv4
+address.
# `--dns-ipv6-addr`
-Tell curl to bind to a specific IP address when making IPv6 DNS requests, so
-that the DNS requests originate from this address. The argument should be a
-single IPv6 address.
+Bind to a specific IP address when making IPv6 DNS requests, so that the DNS
+requests originate from this address. The argument should be a single IPv6
+address.
# `--etag-compare`
-This option makes a conditional HTTP request for the specific ETag read
-from the given file by sending a custom If-None-Match header using the
-stored ETag.
+Make a conditional HTTP request for the specific ETag read from the given file
+by sending a custom If-None-Match header using the stored ETag.
For correct results, make sure that the specified file contains only a
single line with the desired ETag. An empty file is parsed as an empty
# `--etag-save`
-This option saves an HTTP ETag to the specified file. An ETag is a
-caching related header, usually returned in a response.
+Save an HTTP ETag to the specified file. An ETag is a caching related header,
+usually returned in a response.
If no ETag is sent by the server, an empty file is created.
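For example, a pair of transfers where the first saves the ETag and the second
sends it back (etag.txt and the URL are example names):

    curl --etag-save etag.txt https://example.com/file
    curl --etag-compare etag.txt https://example.com/file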
Maximum time in seconds that you allow curl to wait for a 100-continue
response when curl emits an Expect: 100-continue header in its request. By
-default curl waits one second. This option accepts decimal values! When
-curl stops waiting, it continues as if the response has been received.
+default curl waits one second. This option accepts decimal values. When curl
+stops waiting, it continues as if a response was received.
-The decimal value needs to provided using a dot (.) as decimal separator - not
-the local version even if it might be using another separator.
+The decimal value needs to be provided using a dot (`.`) as decimal separator
+- not the local version even if it might be using another separator.
Return an error on server errors where the HTTP response code is 400 or
greater. In normal cases when an HTTP server fails to deliver a document, it
-returns an HTML document stating so (which often also describes why and
-more). This flag allows curl to output and save that content but also to
-return error 22.
+returns an HTML document stating so (which often also describes why and more).
+This option allows curl to output and save that content but also to return
+error 22.
This is an alternative option to --fail which makes curl fail for the same
circumstances but without saving the content.
Fail fast with no output at all on server errors. This is useful to enable
scripts and users to better deal with failed attempts. In normal cases when an
HTTP server fails to deliver a document, it returns an HTML document stating
-so (which often also describes why and more). This flag prevents curl from
-outputting that and return error 22.
+so (which often also describes why and more). This command line option
+prevents curl from outputting that and makes it return error 22 instead.
This method is not fail-safe and there are occasions where non-successful
response codes slip through, especially when authentication is involved
# `--false-start`
-Tells curl to use false start during the TLS handshake. False start is a mode
-where a TLS client starts sending application data before verifying the
-server's Finished message, thus saving a round trip when performing a full
-handshake.
+Use false start during the TLS handshake. False start is a mode where a TLS
+client starts sending application data before verifying the server's Finished
+message, thus saving a round trip when performing a full handshake.
-This is currently only implemented in the Secure Transport (on iOS 7.0 or
-later, or OS X 10.9 or later) backend.
+This functionality is currently only implemented in the Secure Transport (on
+iOS 7.0 or later, or OS X 10.9 or later) backend.
# `--form-escape`
-Tells curl to pass on names of multipart form fields and files using
-backslash-escaping instead of percent-encoding.
+Pass on names of multipart form fields and files using backslash-escaping
+instead of percent-encoding.
# `--form`
-For HTTP protocol family, this lets curl emulate a filled-in form in which a
-user has pressed the submit button. This causes curl to POST data using the
-Content-Type multipart/form-data according to RFC 2388.
+For the HTTP protocol family, emulate a filled-in form in which a user has
+pressed the submit button. This makes curl POST data using the Content-Type
+multipart/form-data according to RFC 2388.
-For SMTP and IMAP protocols, this is the means to compose a multipart mail
-message to transmit.
+For SMTP and IMAP protocols, this composes a multipart mail message to
+transmit.
This enables uploading of binary files etc. To force the 'content' part to be
a file, prefix the filename with an @ sign. To just get the content part from
while the \< makes a text field and just get the contents for that text field
from a file.
-Tell curl to read content from stdin instead of a file by using - as
-filename. This goes for both @ and \< constructs. When stdin is used, the
-contents is buffered in memory first by curl to determine its size and allow a
-possible resend. Defining a part's data from a named non-regular file (such as
-a named pipe or similar) is not subject to buffering and is instead read at
+Read content from stdin instead of a file by using a single "-" as filename.
+This goes for both @ and \< constructs. When stdin is used, the contents are
+buffered in memory first by curl to determine the size and allow a possible
+resend. Defining a part's data from a named non-regular file (such as a named
+pipe or similar) is not subject to buffering and is instead read at
transmission time; since the full size is unknown before the transfer starts,
such data is sent as chunks by HTTP and rejected by IMAP.
curl -F "story=<hugefile.txt" https://example.com/
-You can also tell curl what Content-Type to use by using 'type=', in a manner
-similar to:
+You can also instruct curl what Content-Type to use by using `type=`, in a
+manner similar to:
curl -F "web=@index.html;type=text/html" example.com
server. The method argument should be one of the following alternatives:
## multicwd
-curl does a single CWD operation for each path part in the given URL. For deep
-hierarchies this means many commands. This is how RFC 1738 says it should
-be done. This is the default but the slowest behavior.
+Do a single CWD operation for each path part in the given URL. For deep
+hierarchies this means many commands. This is how RFC 1738 says it should be
+done. This is the default but the slowest behavior.
## nocwd
-curl does no CWD at all. curl does SIZE, RETR, STOR etc and give a full
-path to the server for all these commands. This is the fastest behavior.
+Do no CWD at all. curl does SIZE, RETR, STOR etc and gives the full path to
+the server for each of these commands. This is the fastest behavior.
## singlecwd
-curl does one CWD with the full target directory and then operates on the file
+Do one CWD with the full target directory and then operate on the file
"normally" (like in the multicwd case). This is somewhat more standards
-compliant than 'nocwd' but without the full penalty of 'multicwd'.
+compliant than `nocwd` but without the full penalty of `multicwd`.
# `--ftp-port`
Reverses the default initiator/listener roles when connecting with FTP. This
-option makes curl use active mode. curl then tells the server to connect back
-to the client's specified address and port, while passive mode asks the server
-to setup an IP address and port for it to connect to. \<address\> should be
-one of:
+option makes curl use active mode. curl then commands the server to connect
+back to the client's specified address and port, while passive mode asks the
+server to set up an IP address and port for it to connect to. \<address\>
+should be one of:
## interface
e.g. **eth0** to specify which interface's IP address you want to use (Unix only)
# `--ftp-pret`
-Tell curl to send a PRET command before PASV (and EPSV). Certain FTP servers,
-mainly drftpd, require this non-standard command for directory listings as
-well as up and downloads in PASV mode.
+Send a PRET command before PASV (and EPSV). Certain FTP servers, mainly
+drftpd, require this non-standard command for directory listings as well as up
+and downloads in PASV mode.
# `--ftp-skip-pasv-ip`
-Tell curl to not use the IP address the server suggests in its response to
-curl's PASV command when curl connects the data connection. Instead curl
-reuses the same IP address it already uses for the control connection.
+Do not use the IP address the server suggests in its response to curl's PASV
+command when curl connects the data connection. Instead curl reuses the same
+IP address it already uses for the control connection.
This option is enabled by default (added in 7.74.0).
# `--globoff`
-This option switches off the "URL globbing parser". When you set this option,
-you can specify URLs that contain the letters {}[] without having curl itself
+Switch off the URL globbing function. When you set this option, you can
+specify URLs that contain the letters {}[] without having curl itself
interpret them. Note that these letters are not normal legal URL contents but
they should be encoded according to the URI standard.
# `--haproxy-protocol`
-Send a HAProxy PROXY protocol v1 header at the beginning of the
-connection. This is used by some load balancers and reverse proxies to
-indicate the client's true IP address and port.
+Send a HAProxy PROXY protocol v1 header at the beginning of the connection.
+This is used by some load balancers and reverse proxies to indicate the
+client's true IP address and port.
This option is primarily useful when sending test requests to a service that
expects this header.
# `--help`
-Usage help. This lists all curl command line options within the given
-**category**.
+Usage help. List all curl command line options within the given **category**.
-If no argument is provided, curl displays only the most important command line
+If no argument is provided, curl displays the most important command line
arguments.
For category **all**, curl displays help for all options.
# `--hsts`
-This option enables HSTS for the transfer. If the filename points to an
-existing HSTS cache file, that is used. After a completed transfer, the cache
-is saved to the filename again if it has been modified.
+Enable HSTS for the transfer. If the filename points to an existing HSTS cache
+file, that is used. After a completed transfer, the cache is saved to the
+filename again if it has been modified.
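For example, keeping the HSTS cache in a local file (hsts.txt is an example
name):

    curl --hsts hsts.txt http://example.com/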
If curl is told to use HTTP:// for a transfer involving a hostname that exists
in the HSTS cache, it upgrades the transfer to use HTTPS. Each HSTS cache
# `--http0.9`
-Tells curl to be fine with HTTP version 0.9 response.
+Accept an HTTP version 0.9 response.
HTTP/0.9 is a response without headers and therefore you can also connect with
this to non-HTTP servers and still get a response since curl simply
# `--http1.0`
-Tells curl to use HTTP version 1.0 instead of using its internally preferred
-HTTP version.
+Use HTTP version 1.0 instead of curl's internally preferred HTTP version.
# `--http1.1`
-Tells curl to use HTTP version 1.1.
+Use HTTP version 1.1. This is the default with HTTP:// URLs.
# `--http2-prior-knowledge`
-Tells curl to issue its non-TLS HTTP requests using HTTP/2 without HTTP/1.1
-Upgrade. It requires prior knowledge that the server supports HTTP/2 straight
-away. HTTPS requests still do HTTP/2 the standard way with negotiated protocol
+Issue non-TLS HTTP requests using HTTP/2 directly, without HTTP/1.1 Upgrade.
+It requires prior knowledge that the server supports HTTP/2 straight away.
+HTTPS requests still do HTTP/2 the standard way with negotiated protocol
version in the TLS handshake.
# `--http2`
-Tells curl to use HTTP version 2.
+Use HTTP/2.
For HTTPS, this means curl negotiates HTTP/2 in the TLS handshake. curl does
this by default.
# `--http3`
-Tells curl to try HTTP/3 to the host in the URL, but fallback to earlier
-HTTP versions if the HTTP/3 connection establishment fails. HTTP/3 is only
-available for HTTPS and not for HTTP URLs.
+Attempt HTTP/3 to the host in the URL, but fall back to earlier HTTP versions
+if the HTTP/3 connection establishment fails. HTTP/3 is only available for
+HTTPS and not for HTTP URLs.
This option allows a user to avoid using the Alt-Svc method of upgrading to
HTTP/3 when you know that the target speaks HTTP/3 on the given host and port.
# `--ipv4`
-This option tells curl to use IPv4 addresses only when resolving hostnames,
-and not for example try IPv6.
+Use IPv4 addresses only when resolving hostnames, and not for example try
+IPv6.
# `--ipv6`
-This option tells curl to use IPv6 addresses only when resolving hostnames,
-and not for example try IPv4.
+Use IPv6 addresses only when resolving hostnames, and not for example try
+IPv4.
# `--keepalive-time`
-This option sets the time a connection needs to remain idle before sending
-keepalive probes and the time between individual keepalive probes. It is
-currently effective on operating systems offering the `TCP_KEEPIDLE` and
-`TCP_KEEPINTVL` socket options (meaning Linux, recent AIX, HP-UX and more).
-Keepalive is used by the TCP stack to detect broken networks on idle
-connections. The number of missed keepalive probes before declaring the
-connection down is OS dependent and is commonly 9 or 10. This option has no
-effect if --no-keepalive is used.
+Set the time a connection needs to remain idle before sending keepalive probes
+and the time between individual keepalive probes. It is currently effective on
+operating systems offering the `TCP_KEEPIDLE` and `TCP_KEEPINTVL` socket
+options (meaning Linux, recent AIX, HP-UX and more). Keepalive is used by the
+TCP stack to detect broken networks on idle connections. The number of missed
+keepalive probes before declaring the connection down is OS dependent and is
+commonly 9 or 10. This option has no effect if --no-keepalive is used.
If unspecified, the option defaults to 60 seconds.
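For example, probing idle connections after 30 seconds (an arbitrary example
value):

    curl --keepalive-time 30 https://example.com/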
# `--limit-rate`
Specify the maximum transfer rate you want curl to use - for both downloads
-and uploads. This feature is useful if you have a limited pipe and you would like
-your transfer not to use your entire bandwidth. To make it slower than it
+and uploads. This feature is useful if you have a limited pipe and you would
+like your transfer not to use your entire bandwidth. To make it slower than it
otherwise would be.
The given speed is measured in bytes/second, unless a suffix is appended.
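For example, capping the transfer at roughly 100 kilobytes per second:

    curl --limit-rate 100K https://example.com/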
# `--list-only`
-(FTP)
-When listing an FTP directory, this switch forces a name-only view. This is
-especially useful if the user wants to machine-parse the contents of an FTP
-directory since the normal directory view does not use a standard look or
-format. When used like this, the option causes an NLST command to be sent to
-the server instead of LIST.
+When listing an FTP directory, force a name-only view. This is particularly
+useful if the user wants to machine-parse the contents of an FTP directory
+since the normal directory view does not use a standard look or format. When
+used like this, the option causes an NLST command to be sent to the server
+instead of LIST.
Note: Some FTP servers list only files in their response to NLST; they do not
include sub-directories and symbolic links.
-(SFTP)
When listing an SFTP directory, this switch forces a name-only view, one per
line. This is especially useful if the user wants to machine-parse the
contents of an SFTP directory since the normal directory view provides more
information than just filenames.
-(POP3)
When retrieving a specific email from POP3, this switch forces a LIST command
to be performed instead of RETR. This is particularly useful if the user wants
to see if a specific message-id exists on the server and what size it is.
Like --location, but allows sending the name + password to all hosts that the
site may redirect to. This may or may not introduce a security breach if the
-site redirects you to a site to which you send your authentication info
-(which is clear-text in the case of HTTP Basic authentication).
+site redirects you to a site to which you send your authentication info (which
+is clear-text in the case of HTTP Basic authentication).
# `--max-time`
-Maximum time in seconds that you allow each transfer to take. This is useful
-for preventing your batch jobs from hanging for hours due to slow networks or
-links going down. This option accepts decimal values (added in 7.32.0).
+Set maximum time in seconds that you allow each transfer to take. Prevents
+your batch jobs from hanging for hours due to slow networks or links going
+down. This option accepts decimal values (added in 7.32.0).
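For example, giving up after at most 10 seconds:

    curl --max-time 10 https://example.com/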
If you enable retrying the transfer (--retry) then the maximum time counter is
reset each time the transfer is retried. You can use --retry-max-time to limit
# `--negotiate`
-Enables Negotiate (SPNEGO) authentication.
+Enable Negotiate (SPNEGO) authentication.
This option requires a library built with GSS-API or SSPI support. Use
--version to see if your curl supports GSS-API/SSPI or SPNEGO.
# `--netrc-file`
-This option is similar to --netrc, except that you provide the path (absolute
-or relative) to the netrc file that curl should use. You can only specify one
-netrc file per invocation.
+Set the netrc file to use. Similar to --netrc, except that you also provide
+the path (absolute or relative).
It abides by --netrc-optional if specified.
# `--netrc`
-Makes curl scan the *.netrc* file in the user's home directory for login name
+Make curl scan the *.netrc* file in the user's home directory for login name
and password. This is typically used for FTP on Unix. If used with HTTP, curl
enables user authentication. See *netrc(5)* and *ftp(1)* for details on the
file format. Curl does not complain if that file does not have the right
# `--next`
-Tells curl to use a separate operation for the following URL and associated
-options. This allows you to send several URL requests, each with their own
-specific options, for example, such as different usernames or custom requests
-for each.
+Use a separate operation for the following URL and associated options. This
+allows you to send several URL requests, each with their own specific options,
+for example, such as different usernames or custom requests for each.
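For example, a GET followed by a POST in one invocation (the URLs and data are
placeholders):

    curl https://example.com/one --next --data "moo" https://example.com/two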
--next resets all local options and only global ones have their values survive
over to the operation following the --next instruction. Global options include
# `--ntlm`
-Enables NTLM authentication. The NTLM authentication method was designed by
+Use NTLM authentication. The NTLM authentication method was designed by
Microsoft and is used by IIS web servers. It is a proprietary protocol,
reverse-engineered by clever people and implemented in curl based on their
efforts. This kind of behavior should not be endorsed, you should encourage
# `--output-dir`
-This option specifies the directory in which files should be stored, when
---remote-name or --output are used.
+Specify the directory in which files should be stored, when --remote-name or
+--output are used.
The given output directory is used for all URLs and output options on the
command line, up until the first --next.
Or for Windows:
curl example.com -o nul
+
+Specify the filename as a single minus ("-") to force the output to stdout,
+overriding curl's built-in protection against writing binary output to the
+terminal:
+
+ curl https://example.com/jpeg -o -
# `--path-as-is`
-Tell curl to not handle sequences of /../ or /./ in the given URL
-path. Normally curl squashes or merges them according to standards but with
-this option set you tell it not to do that.
+Do not handle sequences of /../ or /./ in the given URL path. Normally curl
+squashes or merges them according to standards but with this option set you
+tell it not to do that.
# `--pinnedpubkey`
-Tells curl to use the specified public key file (or hashes) to verify the
-peer. This can be a path to a file which contains a single public key in PEM
-or DER format, or any number of base64 encoded sha256 hashes preceded by
-'sha256//' and separated by ';'.
+Use the specified public key file (or hashes) to verify the peer. This can be
+a path to a file which contains a single public key in PEM or DER format, or
+any number of base64 encoded sha256 hashes preceded by 'sha256//' and
+separated by ';'.
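For example, pinning against a local public key file (key.pem is an example
name):

    curl --pinnedpubkey key.pem https://example.com/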
When negotiating a TLS or SSL connection, the server sends a certificate
indicating its identity. A public key is extracted from this certificate and
# `--post301`
-Tells curl to respect RFC 7231/6.4.2 and not convert POST requests into GET
-requests when following a 301 redirection. The non-RFC behavior is ubiquitous
-in web browsers, so curl does the conversion by default to maintain
-consistency. However, a server may require a POST to remain a POST after such
-a redirection. This option is meaningful only when using --location.
+Respect RFC 7231/6.4.2 and do not convert POST requests into GET requests when
+following a 301 redirect. The non-RFC behavior is ubiquitous in web browsers,
+so curl does the conversion by default to maintain consistency. However, a
+server may require a POST to remain a POST after such a redirection. This
+option is meaningful only when using --location.
# `--post302`
-Tells curl to respect RFC 7231/6.4.3 and not convert POST requests into GET
-requests when following a 302 redirection. The non-RFC behavior is ubiquitous
-in web browsers, so curl does the conversion by default to maintain
-consistency. However, a server may require a POST to remain a POST after such
-a redirection. This option is meaningful only when using --location.
+Respect RFC 7231/6.4.3 and do not convert POST requests into GET requests when
+following a 302 redirect. The non-RFC behavior is ubiquitous in web browsers,
+so curl does the conversion by default to maintain consistency. However, a
+server may require a POST to remain a POST after such a redirection. This
+option is meaningful only when using --location.
# `--post303`
-Tells curl to violate RFC 7231/6.4.4 and not convert POST requests into GET
-requests when following 303 redirections. A server may require a POST to
-remain a POST after a 303 redirection. This option is meaningful only when
-using --location.
+Violate RFC 7231/6.4.4 and do not convert POST requests into GET requests when
+following a 303 redirect. A server may require a POST to remain a POST after a
+303 redirection. This option is meaningful only when using --location.
# `--proto-default`
-Tells curl to use *protocol* for any URL missing a scheme name.
+Use *protocol* for any provided URL missing a scheme.
-An unknown or unsupported protocol causes error
-*CURLE_UNSUPPORTED_PROTOCOL* (1).
+An unknown or unsupported protocol causes error *CURLE_UNSUPPORTED_PROTOCOL*.
This option does not change the default proxy protocol (http).
# `--proto-redir`
-Tells curl to limit what protocols it may use on redirect. Protocols denied by
---proto are not overridden by this option. See --proto for how protocols are
-represented.
+Limit what protocols to allow on redirects. Protocols denied by --proto are
+not overridden by this option. See --proto for how protocols are represented.
Example, allow only HTTP and HTTPS on redirect:
# `--proto`
-Tells curl to limit what protocols it may use for transfers. Protocols are
-evaluated left to right, are comma separated, and are each a protocol name or
-'all', optionally prefixed by zero or more modifiers. Available modifiers are:
+Limit what protocols to allow for transfers. Protocols are evaluated left to
+right, are comma separated, and are each a protocol name or 'all', optionally
+prefixed by zero or more modifiers. Available modifiers are:
## +
Permit this protocol in addition to protocols already permitted (this is
# `--proxy-anyauth`
-Tells curl to pick a suitable authentication method when communicating with
+Automatically pick a suitable authentication method when communicating with
the given HTTP proxy. This might cause an extra request/response round-trip.
# `--proxy-basic`
-Tells curl to use HTTP Basic authentication when communicating with the given
-proxy. Use --basic for enabling HTTP Basic with a remote host. Basic is the
-default authentication method curl uses with proxies.
+Use HTTP Basic authentication when communicating with the given proxy. Use
+--basic for enabling HTTP Basic with a remote host. Basic is the default
+authentication method curl uses with proxies.
# `--proxy-ca-native`
-Tells curl to use the CA store from the native operating system to verify the
-HTTPS proxy. By default, curl uses a CA store provided in a single file or
-directory, but when using this option it interfaces the operating system's own
-vault.
+Use the CA store from the native operating system to verify the HTTPS proxy.
+By default, curl uses a CA store provided in a single file or directory, but
+when using this option it interfaces the operating system's own vault.
This option works for curl on Windows when built to use OpenSSL, wolfSSL
(added in 8.3.0) or GnuTLS (added in 8.5.0). When curl on Windows is built to
# `--proxy-digest`
-Tells curl to use HTTP Digest authentication when communicating with the given
-proxy. Use --digest for enabling HTTP Digest with a remote host.
+Use HTTP Digest authentication when communicating with the given proxy. Use
+--digest for enabling HTTP Digest with a remote host.
# `--proxy-http2`
-Tells curl to try negotiate HTTP version 2 with an HTTPS proxy. The proxy might
-still only offer HTTP/1 and then curl sticks to using that version.
+Negotiate HTTP/2 with an HTTPS proxy. The proxy might still only offer HTTP/1
+and then curl sticks to using that version.
This has no effect for any other kinds of proxies.
# `--proxy-negotiate`
-Tells curl to use HTTP Negotiate (SPNEGO) authentication when communicating
-with the given proxy. Use --negotiate for enabling HTTP Negotiate (SPNEGO)
-with a remote host.
+Use HTTP Negotiate (SPNEGO) authentication when communicating with the given
+proxy. Use --negotiate for enabling HTTP Negotiate (SPNEGO) with a remote
+host.
# `--proxy-ntlm`
-Tells curl to use HTTP NTLM authentication when communicating with the given
-proxy. Use --ntlm for enabling NTLM with a remote host.
+Use HTTP NTLM authentication when communicating with the given proxy. Use
+--ntlm for enabling NTLM with a remote host.
# `--proxy-pinnedpubkey`
-Tells curl to use the specified public key file (or hashes) to verify the
-proxy. This can be a path to a file which contains a single public key in PEM
-or DER format, or any number of base64 encoded sha256 hashes preceded by
-'sha256//' and separated by ';'.
+Use the specified public key file (or hashes) to verify the proxy. This can be
+a path to a file which contains a single public key in PEM or DER format, or
+any number of base64 encoded sha256 hashes preceded by 'sha256//' and
+separated by ';'.
When negotiating a TLS or SSL connection, the server sends a certificate
indicating its identity. A public key is extracted from this certificate and
# `--proxy-service-name`
-This option allows you to change the service name for proxy negotiation.
+Set the service name for proxy negotiation.
# `--proxy-tls13-ciphers`
-Specifies which cipher suites to use in the connection to your HTTPS proxy
-when it negotiates TLS 1.3. The list of ciphers suites must specify valid
-ciphers. Read up on TLS 1.3 cipher suite details on this URL:
+Specify which cipher suites to use in the connection to your HTTPS proxy when
+it negotiates TLS 1.3. The list of cipher suites must specify valid ciphers.
+Read up on TLS 1.3 cipher suite details on this URL:
https://curl.se/docs/ssl-ciphers.html
# `--referer`
-Sends the referrer URL in the HTTP request. This can also be set with the
+Set the referrer URL in the HTTP request. This can also be set with the
--header flag of course. When used with --location you can append `;auto` to
the --referer URL to make curl automatically set the previous URL when it
follows a Location: header. The `;auto` string can be used alone, even if you
# `--remote-header-name`
-This option tells the --remote-name option to use the server-specified
-Content-Disposition filename instead of extracting a filename from the URL. If
-the server-provided filename contains a path, that is stripped off before the
-filename is used.
+Tell the --remote-name option to use the server-specified Content-Disposition
+filename instead of extracting a filename from the URL. If the server-provided
+filename contains a path, that is stripped off before the filename is used.
The file is saved in the current directory, or in the directory specified with
--output-dir.
# `--remote-name-all`
-This option changes the default action for all given URLs to be dealt with as
-if --remote-name were used for each one. If you want to disable that for a
+Change the default action for all given URLs to be dealt with as if
+--remote-name were used for each one. If you want to disable that for a
specific URL after --remote-name-all has been used, you must use "-o -" or
--no-remote-name.
# `--remove-on-error`
-When curl returns an error when told to save output in a local file, this
-option removes that saved file before exiting. This prevents curl from
-leaving a partial file in the case of an error during transfer.
+Remove the output file if an error occurs. If curl returns an error when told
+to save output in a local file, that saved file is removed before curl exits.
+This prevents curl from leaving a partial file in the case of an error during
+transfer.
If the output is not a regular file, this option has no effect.
# `--request-target`
-Tells curl to use an alternative target (path) instead of using the path as
-provided in the URL. Particularly useful when wanting to issue HTTP requests
-without leading slash or other data that does not follow the regular URL
-pattern, like "OPTIONS *".
+Use an alternative target (path) instead of using the path as provided in the
+URL. Particularly useful when wanting to issue HTTP requests without leading
+slash or other data that does not follow the regular URL pattern, like
+"OPTIONS *".
curl passes on the verbatim string you give it in the request without any
filter or other safeguards. That includes white space and control characters.
you need several entries if you want to provide addresses for the same host but
different ports.
-By specifying '*' as host you can tell curl to resolve any host and specific
+By specifying `*` as host you can tell curl to resolve any host and specific
port pair to the specified address. Wildcard is resolved last so any --resolve
with a specific host and port is used first.
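For example, forcing example.com port 443 to resolve to a specific address
(192.0.2.1 is a documentation address):

    curl --resolve example.com:443:192.0.2.1 https://example.com/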
# `--service-name`
-This option allows you to change the service name for SPNEGO.
+Set the service name for SPNEGO.
# `--socks5-basic`
-Tells curl to use username/password authentication when connecting to a SOCKS5
-proxy. The username/password authentication is enabled by default. Use
---socks5-gssapi to force GSS-API authentication to SOCKS5 proxies.
+Use username/password authentication when connecting to a SOCKS5 proxy. The
+username/password authentication is enabled by default. Use --socks5-gssapi to
+force GSS-API authentication to SOCKS5 proxies.
# `--socks5-gssapi-service`
-The default service name for a socks server is **rcmd/server-fqdn**. This option
-allows you to change it.
+Set the service name for a socks server. Default is **rcmd/server-fqdn**.
# `--socks5-gssapi`
-Tells curl to use GSS-API authentication when connecting to a SOCKS5 proxy.
-The GSS-API authentication is enabled by default (if curl is compiled with
-GSS-API support). Use --socks5-basic to force username/password authentication
-to SOCKS5 proxies.
+Use GSS-API authentication when connecting to a SOCKS5 proxy. The GSS-API
+authentication is enabled by default (if curl is compiled with GSS-API
+support). Use --socks5-basic to force username/password authentication to
+SOCKS5 proxies.
# `--ssl-allow-beast`
-This option tells curl to not work around a security flaw in the SSL3 and
-TLS1.0 protocols known as BEAST. If this option is not used, the SSL layer may
-use workarounds known to cause interoperability problems with some older SSL
-implementations.
+Do not work around a security flaw in the SSL3 and TLS1.0 protocols known as
+BEAST. If this option is not used, the SSL layer may use workarounds known to
+cause interoperability problems with some older SSL implementations.
**WARNING**: this option loosens the SSL security, and by using this flag you
ask for exactly that.
# `--ssl-auto-client-cert`
-(Schannel) Tell libcurl to automatically locate and use a client certificate
-for authentication, when requested by the server. Since the server can request
-any certificate that supports client authentication in the OS certificate
-store it could be a privacy violation and unexpected.
+(Schannel) Automatically locate and use a client certificate for
+authentication, when requested by the server. Since the server can request any
+certificate that supports client authentication in the OS certificate store it
+could be a privacy violation and unexpected.
# `--ssl-no-revoke`
-(Schannel) This option tells curl to disable certificate revocation checks.
-WARNING: this option loosens the SSL security, and by using this flag you ask
-for exactly that.
+(Schannel) Disable certificate revocation checks. WARNING: this option loosens
+the SSL security, and by using this flag you ask for exactly that.
# `--ssl-revoke-best-effort`
-(Schannel) This option tells curl to ignore certificate revocation checks when
-they failed due to missing/offline distribution points for the revocation check
-lists.
+(Schannel) Ignore certificate revocation checks when they failed due to
+missing/offline distribution points for the revocation check lists.
# `--styled-output`
-Enables the automatic use of bold font styles when writing HTTP headers to the
+Enable automatic use of bold font styles when writing HTTP headers to the
terminal. Use --no-styled-output to switch them off.
Styled output requires a terminal that supports bold fonts. This feature is
# `--tftp-no-options`
-Tells curl not to send TFTP options requests.
-
-This option improves interop with some legacy servers that do not acknowledge
-or properly implement TFTP options. When this option is used --tftp-blksize is
-ignored.
+Do not send TFTP options requests. This improves interop with some legacy
+servers that do not acknowledge or properly implement TFTP options. When this
+option is used --tftp-blksize is ignored.
# `--tlsv1`
-Tells curl to use at least TLS version 1.x when negotiating with a remote TLS
-server. That means TLS version 1.0 or higher
+Use at least TLS version 1.x when negotiating with a remote TLS server. That
+means TLS version 1.0 or higher.
# `--trace-ascii`
-Enables a full trace dump of all incoming and outgoing data, including
-descriptive information, to the given output file. Use `-` as filename to have
+Save a full trace dump of all incoming and outgoing data, including
+descriptive information, in the given output file. Use `-` as filename to have
the output sent to stdout.
This is similar to --trace, but leaves out the hex part and only shows the
# `--trace`
-Enables a full trace dump of all incoming and outgoing data, including
-descriptive information, to the given output file. Use "-" as filename to have
+Save a full trace dump of all incoming and outgoing data, including
+descriptive information, in the given output file. Use "-" as filename to have
the output sent to stdout. Use "%" as filename to have the output sent to
stderr.
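For example, writing the full trace to a local file (dump.txt is an example
name):

    curl --trace dump.txt https://example.com/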
# `--upload-file`
-This transfers the specified local file to the remote URL.
+Upload the specified local file to the remote URL.
If there is no file part in the specified URL, curl appends the local file
name to the end of the URL before the operation starts. You must use a
# `--url-query`
-This option adds a piece of data, usually a name + value pair, to the end of
-the URL query part. The syntax is identical to that used for --data-urlencode
-with one extension:
+Add a piece of data, usually a name + value pair, to the end of the URL query
+part. The syntax is identical to that used for --data-urlencode with one
+extension:
If the argument starts with a '+' (plus), the rest of the string is provided
as-is unencoded.
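For example, appending a URL-encoded name + value pair to the query part (the
name and value are placeholders):

    curl --url-query "search=brown fox" https://example.com/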
# `--url`
-Specify a URL to fetch. This option is mostly handy when you want to specify
-URL(s) in a config file.
+Specify a URL to fetch.
If the given URL is missing a scheme name (such as `http://` or `ftp://` etc)
then curl makes a guess based on the host. If the outermost subdomain name
# `--use-ascii`
-Enable ASCII transfer. For FTP, this can also be enforced by using a URL that
-ends with `;type=A`. This option causes data sent to stdout to be in text mode
-for win32 systems.
+Enable ASCII transfer mode. For FTP, this can also be enforced by using a URL
+that ends with `;type=A`. This option causes data sent to stdout to be in text
+mode for win32 systems.
# `--xattr`
-When saving output to a file, this option tells curl to store certain file
-metadata in extended file attributes. Currently, the URL is stored in the
-`xdg.origin.url` attribute and, for HTTP, the content type is stored in the
-`mime_type` attribute. If the file system does not support extended
-attributes, a warning is issued.
+When saving output to a file, tell curl to store file metadata in extended
+file attributes. Currently, the URL is stored in the `xdg.origin.url`
+attribute and, for HTTP, the content type is stored in the `mime_type`
+attribute. If the file system does not support extended attributes, a warning
+is issued.
if(($l == 1) && ($line =~ /^---/)) {
# first line is a meta-data divider, skip to the next one
$metadata = 1;
- print STDERR "skip meta-data in $f\n";
next;
}
elsif($metadata) {