curl

copy a URL

                    _   _ ____  _
Project         ___| | | |  _ \| |
               / __| | | | |_) | |
              | (__| |_| |  _ <| |___
               \___|\___/|_| \_\_____|
curl [options] [URL...]

copy data from or to a server, using one of DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET or TFTP, without user interaction.

Proxy support, user authentication, FTP upload, HTTP post, SSL connections, cookies, file transfer resume ….

As with all the documentation on this site, this is a tersified version.
The complete documentation is at curl.se/docs/manpage.html

URL

The URL syntax is protocol-dependent.

Specify multiple URLs or parts of URLs by writing part sets within braces as in:
http://site.{one,two,three}.com

Get sequences of alphanumeric series by using [] as in:
ftp://ftp.numericals.com/file[1-100].txt
ftp://me:secret@ftp.myserver.com/file[001-100].txt
(with leading zeros)
ftp://ftp.letters.com/file[a-z].txt

Nested sequences are not supported, use several ones next to each other:
http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html

Multiple URLs are fetched sequentially.
Use a step counter in the ranges to get every Nth number or letter:
http://www.numericals.com/file[1-100:10].txt
http://www.letters.com/file[a-z:2].txt
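The globbing can be tried offline with the file:// protocol (a sketch, assuming curl is installed; the /tmp path is made up for illustration):

```shell
# create three local files, then fetch them with one globbed URL
mkdir -p /tmp/curl_glob
for i in 1 2 3; do printf 'part %s\n' "$i" > "/tmp/curl_glob/file$i.txt"; done

# [1-3] expands to three sequential fetches, concatenated on stdout;
# quote the URL so the shell does not interpret the brackets
curl -s "file:///tmp/curl_glob/file[1-3].txt"
```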

Without a protocol:// prefix, curl guesses the protocol: HTTP by default, or another protocol based on often-used host name prefixes. For example, host names starting with ftp use FTP.

Options

Boolean options are enabled with --option and disabled with --no-option.

If an option is given multiple times, the last occurrence is used unless otherwise noted.

[--url] URL
URL to fetch; required.
(Not -u, which sets the user.)
Can be set in a config file.
May be used any number of times.
--local-port num[-num] local port number (or range of ports) to use for the connection
-o
--output file | -
   [--create-dirs]
Output to file, or - for stdout.

#n in the file name is replaced with the string matched by the nth glob in the URL.

Example:

curl http://{one,two}.site.com --output  "file_#1.txt"
or use several variables:
curl http://{site,host}.host[1-5].com -o "#1_#2"
Use as many times as there are URLs.
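The #1 substitution can also be tried offline with file:// URLs (a sketch, assuming curl is installed; the /tmp paths are illustrative):

```shell
# make numbered source files
mkdir -p /tmp/curl_out
for i in 1 2; do printf 'data %s\n' "$i" > "/tmp/curl_out/in$i.txt"; done

# #1 is replaced with whatever the [1-2] glob matched,
# producing out_1.txt and out_2.txt
curl -s "file:///tmp/curl_out/in[1-2].txt" -o "/tmp/curl_out/out_#1.txt"
```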
-O
--remote-name
Output to a local file (instead of stdout), named from the remote file name, in the current working directory.
-#
--progress-bar
Progress as a bar:
#######30%

Without --progress-bar, redirecting stdout via > or -o displays the progress meter:

% Total    % Received % Xferd  Average Speed   Time    Time     Time    Current
                                Dload  Upload  Total   Spent    Left     Speed
100 32715  100 32715    0     0   184k      0 --:--:-- --:--:-- --:--:--  190k
-s
--silent
No progress meter or error messages.
-S
--show-error
Show errors even when --silent is used.

--stderr file | -  Redirect stderr to file, or to stdout with -
-g
--globoff
{} and [] are not treated as meta characters in URLs.
Without this option, literal {} and [] in URLs must be encoded:
{ = %7B,   } = %7D,   [ = %5B,   ] = %5D
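An offline sketch of --globoff (assuming curl is installed; the /tmp path is made up):

```shell
# create a file whose name contains literal brackets
mkdir -p /tmp/curl_goff
printf 'literal\n' > '/tmp/curl_goff/file[1].txt'

# -g passes the brackets through instead of treating [1] as a range
curl -s -g 'file:///tmp/curl_goff/file[1].txt'
# prints: literal
```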

HTTP specific options: Receiving

-D file
--dump-header file
Save headers to file.
HTTP/1.1 200 OK
Date: Mon, 10 Aug 2020 20:08:22 GMT
Server: Apache
Last-Modified: Sun, 12 Jul 2020 18:58:37 GMT
Accept-Ranges: bytes
Content-Length: 2975
Content-Type: text/html
Cookies from the headers can be read in a second invocation using --cookie.
--cookie-jar is a better way to store cookies.

FTP: server response lines are considered "headers"

-i
--include
Include headers with output.
HTTP/1.1 200 OK^M
Cache-Control: no-cache, no-store^M
Pragma: no-cache^M
Content-Type: text/html; charset=utf-8^M
…
Strict-Transport-Security: max-age=2592000^M
^M
<!DOCTYPE html>
<head>
<title>
…
-I
--head
Fetch the headers only.
HTTP/1.1 200 OK
Cache-Control: no-cache, no-store
Pragma: no-cache
Content-Length: 32715
Content-Type: text/html; charset=utf-8
Expires: -1
Date: Fri, 06 Jan 2023 17:38:20 GMT
…
Reporting-Endpoints: wildapricot-csp-uel='https://csp.uel.wildapricot.com/report'
Strict-Transport-Security: max-age=2592000
-L
--location
Reissue the request on the new location if the page has moved (indicated with a Location: header and a 3XX response code).
With --include header or --head, headers from all requested pages will be shown.
When authentication is used, curl only sends credentials to the initial host.
See --location-trusted
Limit the amount of redirects to follow by using --max-redirs
If the HTTP response was 301, 302, or 303 and the request was not a GET (e.g. POST or PUT), the next request is made with a GET.
If the response code was any other 3xx code, curl re-sends the request using the same method.
--location-trusted Like --location, but also sends the name and password to hosts redirected to. A security issue if the site redirects somewhere that shouldn't receive the authentication info (which is plaintext in the case of HTTP Basic authentication).
    --max-redirs num maximum redirects to follow with --location .
default: 50. -1 makes it limitless (not recommended).
    --post301 do not convert POST requests to GETs when following a 301 redirect. Only with --location
    --post302 do not convert POST requests to GETs when following a 302 redirect. Only with --location
--compressed Request a compressed response and save it uncompressed
--tr-encoding Request compressed Transfer-Encoding and uncompress the data while receiving it.
--raw do not decode content or transfer encodings

HTTP specific options: Sending

-F
--form name={content | @file | <file | -}
Emulate a form submission by POSTing data using
 Content-Type multipart/form-data.
@ attaches a file upload (ex: datafile=@input.dat).
< takes the content from a file, which makes a text field.
For stdin use -.
curl -F "web=@index.html;type=text/html" url.com
or
curl -F "name=daniel;type=text/foo" url.com
Change the name of an uploaded file with ;filename= :
curl -F "file=@localfile;filename=nameinpost" url.com

Can be used multiple times.

--form-string name=string Like --form, except leading @ and < characters and the ;type= string have no special meaning.
-d
--data {data | @file | -}
--data-ascii
Sends data in a POST request, in the same way that a browser does when a form is submitted, using the
  content-type application/x-www-form-urlencoded
To URLencode the value of a form field use --data-urlencode.
If used more than once, data will be merged with a separating &.
For example: using -d name=daniel -d skill=lousy
to generate the post 'name=daniel&skill=lousy'.
Use @file to read the data from file or - from stdin.

The contents of the file must be URL-encoded.
Multiple files can also be specified.

--data-urlencode data posts data, like --data, with URL-encoding.
To be CGI-compliant, data should begin with a name followed by a separator and a content specification.
The data part can be passed to curl as:
[=]content URL-encode the content and pass it on. The = is not included in the data.
name=content URL-encode the content part and pass it on. name must already be URL-encoded.
@filename load data from the file (including newlines), URL-encode it and pass it on in the POST.
name@filename load data from the file (including newlines), URL-encode it and pass it on in the POST.
name gets an equal sign appended, resulting in name=urlencoded-file-content.
name is expected to be URL-encoded already.
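A rough sketch of the percent-encoding --data-urlencode applies (an illustration in plain shell, not curl's actual implementation; curl encodes everything except unreserved characters):

```shell
# percent-encode everything except RFC 3986 unreserved characters
urlencode() {
  s=$1
  out=''
  while [ -n "$s" ]; do
    rest=${s#?}                  # string minus its first character
    c=${s%"$rest"}               # the first character
    case $c in
      [A-Za-z0-9.~_-]) out="$out$c" ;;                 # unreserved: pass through
      *) out="$out$(printf '%%%02X' "'$c")" ;;         # everything else: %XX
    esac
    s=$rest
  done
  printf '%s\n' "$out"
}

urlencode 'a b&c=d'   # prints a%20b%26c%3Dd
```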
--data-binary data posts data with no processing.
With an @-prefixed filename, data is posted as with --data-ascii, except newlines are preserved and no conversions are done.
Used several times, data is appended as described under --data.
-G
--get
data specified with --data or --data-binary is used in an HTTP GET request.
The data is appended to the URL, separated by a ?.
With -I, the data is appended to the URL with a HEAD request.
Only the first occurrence is honored.
-H
--header header
Extra header to include in the request.
Specifying a header curl sends internally replaces the internal version.
Remove an internal header by giving a replacement without content, example: -H "Host:".
To send a header with no value, terminate it with a semicolon, example: -H "X-Custom-Header;".
See --user-agent and --referer .
Can be used multiple times to add/replace/remove multiple headers.
-A
--user-agent agent string
User-Agent string to send to the HTTP server.
Some CGIs fail if this field isn't set to "Mozilla/4.0".
If the string contains blanks, surround it with quotes.
Can be set with -H,--header
-e
--referer URL
Sends the "Referer" information to the HTTP server. Can be set with --header .
With --location, append ;auto to the --referer URL to set the previous URL when it follows a Location: header.

;auto can be used alone, without an initial --referer.

-J
--remote-header-name
use the server-specified Content-Disposition filename instead of extracting a filename from the URL.

HTTP specific options: cookie handling

-b
--cookie name=value[;…] | file
Without an =, the argument is a filename to read cookies from (incoming cookies are recorded in the cookie jar).
Useful with --location .
The file format is plain HTTP headers or the Netscape cookie file format

To store cookies, use --cookie-jar or --dump-header

-c
--cookie-jar file
After a completed operation, save cookies from file as well as cookies received from the server.
If no cookies are known, no file will be written.
Use - to have cookies written to stdout.

If the cookie jar can't be created or written to, the operation doesn't fail or report an error.
Use -v to get a warning.

-j
--junk-session-cookies
discard "session cookies", as if a new session is started.

Authentication

--basic Basic authentication, the default.
-u
--user user:password
name:password for authentication.
Overrides --netrc and --netrc-optional.
Force a user name and password by specifying a single colon: --user : .
   --digest HTTP Digest authentication. Prevents the password from being sent in clear text. Use with --user
-U
--proxy-user user:password
user name and password to use for the proxy.
--anyauth Let curl determine the authentication method and use the most secure one.
Not recommended for uploads from stdin.
--ntlm (HTTP) Enables NTLM authentication.

To enable NTLM for proxy authentication use --proxy-ntlm.

-E
--cert certificate[:password]
(SSL) use the client certificate file (PEM format) when getting a file with HTTPS, FTPS or another SSL-based protocol.
The "certificate" file may be the private key and the certificate concatenated. See --cert and --key to specify them independently.

With NSS SSL library this tells the nickname of the certificate to use within the NSS database defined by the $SSL_DIR (or by default /etc/pki/nssdb). If the NSS PEM PKCS#11 module (libnsspem.so) is available then PEM files may be loaded.
To use a file from the current directory, precede it with ./

--delegation level What the server is allowed to do with delegated user credentials, with GSS/Kerberos.
none Don't allow any delegation.
policy Delegate if and only if the OK-AS-DELEGATE flag is set in the Kerberos service ticket, which is a matter of realm policy.
always Unconditionally allow the server to delegate.
--ciphers list list of SSL ciphers to use; see SSL ciphers.
NSS ciphers are listed in the NSSCipherSuite entry
--engine name Select the OpenSSL crypto engine . Use list to output a list of build-time supported engines.
Build-time engines:
  <none>
--cert-type type Type of the provided certificate: PEM, DER or ENG. Default: PEM
--cacert certificateFile use the certificate bundle file to verify the peer. May contain multiple CA certificates, in PEM format.
Overrides the default file; $CURL_CA_BUNDLE sets the path to a CA cert bundle.

The windows version looks for curl-ca-bundle.crt in the same directory as curl.exe, or in the Current Working Directory, or in any folder along PATH.

With the NSS SSL library tells curl the nickname of the CA certificate to use within the NSS database defined by the $SSL_DIR (or by default /etc/pki/nssdb). If the NSS PEM PKCS#11 module (libnsspem.so) is available then PEM files may be loaded.

--capath directory[:…] Directory of certificates used to verify the peer.

PEM format.
With OpenSSL, the directory must have been processed with the c_rehash utility supplied with OpenSSL.
Makes SSL connections more efficient than a --cacert file containing many entries.

--crlfile file (HTTPS/FTPS) Certificate Revocation List file, in PEM format, listing revoked certificates.

FTP

-l
--list-only
directory listing, names only.
-I
--head
include size and modification time
-T
--upload-file [file]
Transfers the local file to the remote URL, using the local file name if the URL has none.
Use a trailing / on the last directory to use the local file name.
Use a -T for each URL on the command line.

Each -T + URL pair specifies what to upload where.
Supports "globbing" of the file argument, permitting upload of multiple files to a single URL, using the same globbing style supported in URLs. For example:
curl -T "{file1,file2}" http://www.uploadtothissite.com
or
curl -T "img[1-1000].png" ftp://ftp.picmania.com/upload/

Use - for stdin.
Use . for stdin in non-blocking mode to allow reading server output while stdin is being uploaded.

With HTTP(S) PUT will be used.

-R
--remote-time
retain the remote file's timestamp
-C
--continue-at {-|offset}
Continue a previous file transfer.
With -, the output/input files determine the continuation point; with offset, continue at that byte offset.
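Resuming at an offset can be sketched offline with file:// (assuming curl is installed; the /tmp path is illustrative):

```shell
printf 'hello world\n' > /tmp/curl_resume.txt

# skip the first 6 bytes ("hello ") and continue from there
curl -s -C 6 file:///tmp/curl_resume.txt
```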
-a
--append
with uploads, append to the target file (create it if missing)
--crlf Convert LF to CRLF in uploads (as in Linux to MS Windows).
-B
--use-ascii
ASCII (text) transfer, translating line terminators as needed.
Can also be enabled by appending ;type=A to the URL.
Causes data sent to stdout to be in text mode on win32 systems.
--ftp-create-dirs create missing directories.
--ftp-method [method] method to reach a file in a subdirectory.
multicwd a CWD command for each path part.
singlecwd one CWD with the full directory
nocwd give a full path.
To save the file in a different directory change current working directory first.
Use as many times as there are URLs.
-K
--config file
Default: .curlrc (_curlrc on Windows)
Parameters must be specified on the same line, separated by whitespace, colon, equals sign or any combination (equals sign preferred).
If a parameter contains whitespace, enclose it in quotes.
Escape sequences are : \\, \", \t, \n, \r and \v . '#' is a comment.
One option per line.
Use - as the filename to read from stdin.
To specify a URL in the config file use --url :
     url = "http://curl.haxx.se/docs/"

Long option names can be given without the double dashes. Unless -q is used, curl checks for a default config file:

1) the "home dir": $CURL_HOME, then $HOME.
On UNIX-like systems getpwuid() is used (it returns the home dir of the current user).
On Windows, %APPDATA% is checked, then %USERPROFILE%\Application Data

2) On Windows, if there is no _curlrc file in the home dir, curl checks the directory the curl executable is in.
On UNIX-like systems, curl tries to load .curlrc from the determined home dir.

 # --- Example file ---
       # this is a comment
       url = "curl.haxx.se"
       output = "curlhere.html"
       user-agent = "superagent/1.0"

       # and fetch another URL too
       url = "curl.haxx.se/docs/manpage.html"
       -O
       referer = "http://nowhereatall.com/" 
Can be used multiple times to load multiple config files.
-q as the first parameter on the command line prevents reading the default config file.

FTP port negotiation

passive (the server says what port to use for the data transfer) or active (the client, i.e. curl, decides)
--ftp-pasv Use passive mode. The default; overrides --ftp-port
Tries EPSV first and then PASV, unless --disable-epsv is used.
Passive mode: the server sets up an IP address and port
--ftp-skip-pasv-ip Ignore IP address the server offers in response to PASV.
--disable-eprt
--no-eprt

--eprt

disable/enable the EPRT and LPRT address-and-port-specifying commands.
Normally curl first uses EPRT (extended), then LPRT (long), finally PORT.
Disabling EPRT only changes the active behavior; to switch to passive mode, do not use --ftp-port, or force it with --ftp-pasv.
--disable-epsv
--no-epsv

--epsv

disable/enable EPSV when doing passive transfers.
Disabling EPSV only changes the passive behavior. To switch to active mode use --ftp-port.
-P
--ftp-port address
Use active mode. The server connects back to the client's address and port.
address :
interface Ex: "eth0"
IP address Ex: "192.168.10.1"
host name Ex: "my.host.domain"
- same IP address used for the control connection

Disable the use of PORT with --ftp-pasv.
Disable the attempt to use EPRT instead of PORT by using --disable-eprt. EPRT is an extended PORT.

Append :[S_port][-E_port] to the address, to specify the port range. A single number will fail if that port is not available.

--ftp-pret send PRET before PASV (and EPSV).
Some servers (e.g. drftpd) require this command for directory listings and transfers in PASV mode.
--ftp-alternative-to-user command If authentication with user:pass fails, send command.
-n
--netrc
Use ~/.netrc (_netrc on MSWindows) for name and password.
Does not support macros.
--netrc-optional use the netrc file if it exists, but don't require it
--netrc-file file path to the netrc file.
Overrides --netrc and --netrc-optional
--ftp-account [acctname] Response to "account data". Sent using ACCT .
Security for login, commands and transfer
-k
--insecure
(SSL) explicitly allow "insecure" SSL connections and transfers.
SSL connections are verified using the CA certificate bundle; connections that fail verification are refused unless --insecure is used.
See http://curl.haxx.se/docs/sslcerts.html
--ftp-ssl-ccc authenticate with SSL then CCC (Clear Command Channel)
Allows NAT routers to follow the transaction.
--ftp-ssl-control SSL/TLS for the login, clear for transfer.
--key keyfile (SSL/SSH) Private key file name.
--key-type type Type of the private key provided with --key: DER, PEM or ENG. Default: PEM
--[no-]sessionid (SSL) --no-sessionid disables SSL session-ID caching; --sessionid enforces it.
--pass phrase (SSL/SSH) Passphrase for the private key.
--pubkey file (SSH) Public key file.
--random-file file (SSL) file containing random data to seed the random engine for SSL connections. See --egd-file
--egd-file file (SSL) path name to Entropy Gathering Daemon socket used to seed the random engine
--hostpubmd5 md5 (SFTP/SCP) The connection is refused unless the md5sum of the server's public key matches.
--krb level Enable Kerberos. level is one of clear, safe, confidential or private.
Use --version to see if kerberos4 or GSSAPI (GSS-Negotiate) is supported.
--ftp-ssl-ccc-mode [active|passive]
  • passive will not initiate shutdown, but waits for the server to do it, and
    will not reply to the shutdown from the server.
  • active initiates the shutdown and waits for a reply from the server.
    --tftp-blksize bytes Set TFTP BLKSIZE (must be >512), default 512
    --mail-from address (SMTP) single address that the mail should get sent from.
    --mail-rcpt address (SMTP) single address that the mail should get sent to.
    Use multiple times to specify many recipients.
    --limit-rate bps maximum average transfer rate.
    Append k or K for kilobytes, m or M megabytes, g or G
    Might use higher transfer speeds in short bursts
    --speed-limit takes precedence and may suppress the rate-limiting, to help keep the speed-limit logic working.
    -y
    --speed-time seconds
    Transfers slower than speed-limit bytes/sec for speed-time seconds get aborted.
    If speed-time is used, the default speed-limit is 1.
    -Y
    --speed-limit bps
    Transfers slower than bps for speed-time seconds get aborted.
    speed-time defaults to 30 and is set with -y
    --max-filesize bytes A file larger than bytes causes exit code 63.
    -N
    --[no-]buffer
    Disables buffering of the output stream.
    use --buffer to enforce the buffering.
    --keepalive-time seconds idle time before keepalive probes are sent, and the interval between individual probes.
    --[no-]keepalive disable/enable TCP keepalive (default: enabled).
    --keepalive enforces keepalive.
    --tcp-nodelay Turn on TCP_NODELAY . See curl_easy_setopt(3)
    --interface name interface name, IP address or host name.
    Example curl --interface eth0:1 http://www.netscape.com/
    --noproxy host[,…] hosts which do not use a proxy.
    Use * to disable the proxy entirely.
    Each host in this list is matched as either a domain which contains the hostname, or the hostname itself. For example, local.com would match local.com, local.com:80, and www.local.com, but not www.notlocal.com.
    --proxy-anyauth pick a suitable authentication method with proxy.
    --proxy-basic
    use HTTP Basic authentication with proxy.
    Use --basic for enabling HTTP Basic with a remote host.
    Basic is the default authentication with proxies.
    --proxy-digest use HTTP Digest authentication with the given proxy.
    Use --digest for enabling HTTP Digest with a remote host.
    --proxy-negotiate use HTTP Negotiate authentication when communicating with the given proxy. Use --negotiate for enabling HTTP Negotiate with a remote host.
    --proxy-ntlm use HTTP NTLM authentication when communicating with the given proxy. Use --ntlm for enabling NTLM with a remote host.
    --proxy1.0 proxyhost[:port] Use CONNECT through the proxy, with the HTTP 1.0 protocol.
    Default port is 1080.
    --proto protocol[,…] Protocols to allow for the initial retrieval.
    Evaluated left to right, comma separated; each entry is a protocol name or all, optionally prefixed by a modifier:
    + Add protocol to those already permitted (default if no modifier is used).
    - Remove this protocol from the list of protocols permitted.
    = Permit only this protocol
    For example:
    --proto -ftps uses the default protocols, but disables ftps
    --proto -all,https,+http only enables https and http
    --proto =http,https only enables http and https

    May be used multiple times.
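The modifiers can be sketched offline with the FILE protocol (assuming curl is installed; the /tmp path is made up):

```shell
printf 'ok\n' > /tmp/curl_proto.txt

# permit only the FILE protocol: the fetch succeeds
curl -s --proto =file file:///tmp/curl_proto.txt

# remove FILE from the permitted set: the same fetch now fails
curl -s --proto -file file:///tmp/curl_proto.txt || echo 'file protocol disabled'
```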

    --proto-redir protocol[,…] Use protocols after a redirect. See --proto
    -Q
    --quote command
    (FTP/SFTP) Send an arbitrary command to the remote FTP or SFTP server.
    Commands are sent BEFORE the transfer takes place.
    To have commands sent after curl has changed the working directory, just before the transfer command(s), prefix the command with + (FTP only).
    To have commands take place after a successful transfer, prefix them with a dash -.

    May be used any number of times.
    If the server returns failure for one command, the entire operation is aborted,
    unless (FTP) the command is prefixed with an asterisk (*), in which case curl continues even if it fails.

    File names may be quoted shell-style to embed spaces or special characters.

    SFTP is a binary protocol: curl interprets the commands before sending them. Supported SFTP quote commands:

    chgrp group file sets the group of file to group (a numeric group ID)
    chmod mode file modifies the file mode bits of file
    chown user file sets the owner of file to user (a numeric user ID)
    ln source_file target_file ln and symlink create a symbolic link at target_file pointing to source_file
    mkdir directory_name creates the directory
    pwd returns the absolute pathname of the current working directory
    rename source target renames the file or directory
    rm file removes the file
    rmdir directory removes the directory if empty
    -r
    --range range
    (HTTP/FTP/SFTP/FILE) Retrieve a byte range (i.e. a partial file)
    Examples:
           0-499   first 500 bytes 
           500-999  second 500 bytes 
           -500   last 500 bytes 
           9500-   bytes from offset 9500 and forward  
    Request HTTP to reply with a multipart response:
           0-0,-1           first and last byte only
           500-700,600-799  300 bytes from offset 500
           100-199,500-599  two separate 100-byte ranges
    FTP and SFTP range downloads only support the start-stop syntax (optionally with one of the numbers omitted).

    FTP use depends on the extended FTP command SIZE.
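Ranges can be tried offline with the FILE protocol (a sketch, assuming curl is installed; the /tmp path is illustrative):

```shell
printf 'abcdefghij' > /tmp/curl_range.txt

# bytes 0-4 of the local file
curl -s -r 0-4 file:///tmp/curl_range.txt
# prints: abcde
```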

    --remote-name-all changes the default action for all given URLs, as if --remote-name were used for each one.
    To disable that for a specific URL after --remote-name-all, use --no-remote-name.
    --resolve host:port:address use address for host:port instead of DNS or /etc/hosts.
    port is the one used by the specific protocol.
    Several entries can provide addresses for the same host on different ports.
    Use multiple times to add host names.
    --retry n retry n times on a transient error. default: 0, no retries.
    Transient error: a timeout, an FTP 4xx or an HTTP 5xx response code.
    curl waits one second, then doubles the wait for each retry, up to 10 minutes.
    --retry-delay disables this backoff. See --retry-max-time to limit the total time allowed for retries.
    --retry-delay seconds With --retry, sleep this long before a retry when a transient error occurs.
    zero: use the default backoff time.
    --retry-max-time seconds retry until this timer expires.
    To limit a single request's maximum time, use --max-time.
    The retry timer is reset before the first transfer attempt.
    Set to zero to not time out retries.
    --connect-timeout seconds Maximum time to establish connection.
    -m
    --max-time seconds
    Maximum time for entire operation. See also --connect-timeout .
    --ssl (FTP, POP3, IMAP, SMTP) Try to use SSL/TLS for the connection. Reverts to a non-secure connection if the server doesn't support SSL/TLS. See also --ftp-ssl-control and --ssl-reqd for different levels of required encryption. Formerly --ftp-ssl
    -1
    --tlsv1
    (SSL) use TLS version 1 when negotiating with a remote TLS server.
    -2
    --sslv2
    (SSL) use SSL version 2
    -3
    --sslv3
    (SSL) use SSL version 3
    --ssl-reqd (FTP, POP3, IMAP, SMTP) Require SSL/TLS for the connection. Terminates the connection if the server doesn't support SSL/TLS.
    formerly --ftp-ssl-reqd
    --socks5-gssapi-service servicename change the service name for a socks5 proxy. default: rcmd/server-fqdn.
    Examples:
    --socks5 proxy-name --socks5-gssapi-service sockd uses sockd/proxy-name
    --socks5 proxy-name --socks5-gssapi-service sockd/real-name uses sockd/real-name, for cases where the proxy name does not match the principal name.
    --socks5-gssapi-nec As part of the gssapi negotiation a protection mode is negotiated. RFC 1961 says in section 4.3/4.4 it should be protected, but the NEC reference implementation does not. --socks5-gssapi-nec allows the unprotected exchange of the protection mode negotiation.
    -t
    --telnet-option OPT=val
    Pass options to the telnet protocol:
    TTYPE=term terminal type.
    XDISPLOC=X X display location.
    NEW_ENV=var,val environment variable.
    --tlsauthtype authtype Set TLS authentication type. Currently, the only supported option is "SRP", for TLS-SRP (RFC 5054). If --tlsuser and --tlspassword are specified but --tlsauthtype is not, then this option defaults to "SRP". (Added in 7.21.4)
    --tlsuser user Set username for use with the TLS authentication method specified with --tlsauthtype. Requires that --tlspassword also be set.
    --tlspassword password Set password for use with the TLS authentication method specified with --tlsauthtype. Requires that --tlsuser also be set.
    --environment (RISC OS ONLY) Sets a range of environment variables, using the names -w supports, to allow easier extraction of useful information after having run curl.
    --trace file | - trace dump of all incoming and outgoing data. Overrides previous uses of --verbose or --trace-ascii.
    --trace-ascii file | - trace dump of all incoming and outgoing data, including descriptive information.
    Like --trace, but shows only the ASCII part of the dump.
    Overrides previous uses of --verbose or --trace.
    --trace-time prepends a time stamp to each trace or verbose line:
    15:37:48.247521 *   Trying 144.126.151.19:80...
    15:37:48.259940 * Connected to anti-hacker-alliance.com (144.126.151.19) port 80 (#0)
    15:37:48.260028 > GET / HTTP/1.1
    15:37:48.260028 > Host: anti-hacker-alliance.com
    15:37:48.260028 > User-Agent: curl/7.79.1
    15:37:48.260028 > Accept: */*
    15:37:48.260028 > 
    15:37:48.268780 * Mark bundle as not supporting multiuse
    15:37:48.268887 < HTTP/1.1 301 Moved Permanently
    15:37:48.268941 < Date: Fri, 13 Jan 2023 20:37:48 GMT
    15:37:48.269068 < Server: Apache/2.4.38 (Debian)
    15:37:48.269102 < Location: https://anti-hacker-alliance.com/
    15:37:48.269130 < Content-Length: 21
    15:37:48.269158 < Content-Type: text/html; charset=iso-8859-1
    15:37:48.269186 < 
    15:37:48.269253 * Connection #0 to host anti-hacker-alliance.com left intact
    -v
    --verbose
    each line is prefixed with:
    > "header data" sent,
    < "header data" received (hidden in normal cases), and
    * additional info provided by curl.
    For only HTTP headers, use --include
    For more details, use --trace or --trace-ascii
    overrides previous uses of --trace or --trace-ascii .
    Use --silent to make curl quiet.
    * TCP_NODELAY set
    * Connected to real-world-systems.com (209.95.59.175) port 443 (#0)
    * ALPN, offering h2
    * ALPN, offering http/1.1
    * successfully set certificate verify locations:
    *   CAfile: /etc/ssl/cert.pem
      CApath: none
    * TLSv1.2 (OUT), TLS handshake, Client hello (1): } [236 bytes data]
    * TLSv1.2 (IN), TLS handshake, Server hello (2): { [108 bytes data]
    * TLSv1.2 (IN), TLS handshake, Certificate (11):  { [4489 bytes data]
    * TLSv1.2 (IN), TLS handshake, Server key exchange (12): { [300 bytes data]
    * TLSv1.2 (IN), TLS handshake, Server finished (14): { [4 bytes data]
    * TLSv1.2 (OUT), TLS handshake, Client key exchange (16): } [37 bytes data]
    * TLSv1.2 (OUT), TLS change cipher, Change cipher spec (1): } [1 bytes data]
    * TLSv1.2 (OUT), TLS handshake, Finished (20): } [16 bytes data]
    * TLSv1.2 (IN), TLS change cipher, Change cipher spec (1): { [1 bytes data]
    * TLSv1.2 (IN), TLS handshake, Finished (20): { [16 bytes data]
    * SSL connection using TLSv1.2 / ECDHE-RSA-AES256-GCM-SHA384
    * ALPN, server accepted to use http/1.1
    * Server certificate:
    *  subject: CN=real-world-systems.com
    *  start date: Apr  3 00:00:00 2020 GMT
    *  expire date: Apr 10 23:59:59 2021 GMT
    *  subjectAltName: host "real-world-systems.com" 
             matched cert's "real-world-systems.com"
    *  issuer: C=GB; ST=Greater Manchester; L=Salford; O=Sectigo Limited; 
               CN=Sectigo RSA Domain Validation Secure Server CA
    *  SSL certificate verify ok.
    > GET / HTTP/1.1
    > Host: real-world-systems.com
    > User-Agent: curl/7.64.1
    > Accept: */*
    > 
    < HTTP/1.1 200 OK
    < Date: Mon, 10 Aug 2020 20:39:07 GMT
    < Server: Apache
    < Last-Modified: Sun, 12 Jul 2020 18:58:37 GMT
    < Accept-Ranges: bytes
    < Content-Length: 2975
    < Content-Type: text/html
    < 
    { [2975 bytes data]
    100  2975  100  2975    0     0   9596      0 --:--:-- --:--:-- --:--:--  9596
    * Connection #0 to host real-world-systems.com left intact
    * Closing connection 0
    
    -w
    --write-out format
    Display statistics on stdout .
    The format string may contain plain text and variables which will be substituted by their value.
    To read from a file use @filename.
    To read from stdin use @-.

    variables are prefixed with % and enclosed within {}, as in %{variable_name}
    With multiple variables on the command line, put quotes around the whole format string. To output a %, use %%.
    Output a newline with \n, a carriage return with \r and a tab with \t.
    The % symbol is special in the win32 environment; all occurrences of % must be doubled.

    %{url_effective} URL that was fetched last. Most meaningful if curl follows Location: headers.
    %{http_code} numerical response code in the last retrieved HTTP(S) or FTP(s) transfer.
    %{http_connect} numerical code in the last response (from a proxy) to a curl CONNECT request.
    %{time_namelookup} seconds, until the name resolving was completed.
    %{time_connect} seconds, until the TCP connect to the remote host (or proxy) was completed.
    %{time_appconnect} seconds, until the SSL/SSH/etc connect/handshake to the remote host was completed.
    %{time_pretransfer} seconds, from the start until the file transfer was just about to begin. Includes pre-transfer commands and negotiations specific to the particular protocol(s) involved.
    %{time_redirect} seconds, for all redirection steps including name lookup, connect, pretransfer and transfer, before the final transaction was started; the complete execution time for multiple redirections.
    %{time_starttransfer} seconds, from the start until the first byte was just about to be transferred. Includes time_pretransfer and the time the server needed to calculate the result.
    %{time_total} seconds, the full operation lasted, with subsecond resolution.
    %{size_download} bytes that were downloaded.
    %{size_upload}
    %{size_header}
    %{size_request} bytes that were sent in the HTTP request.
    %{speed_download average}
    %{speed_upload average} content_type of the requested document,
    %{num_connects} Number of new connects
    %{num_redirects} Number of redirects that were followed
    %{redirect_url} a HTTP request without-L to follow redirects, actual URL a redirect result
    %{ftp_entry_path}initial path ended up in when logging on to the remote FTP server.
    %{ssl_verify_result} of the SSL peer certificate verification, 0 means the verification was successful.
    For example:
    curl --write-out "%{time_redirect} %{url_effective} " wepc.us
    …
    </body>
    </html> 
    0.000000 http://wepc.us/ 
    
    > cat curl-stats  # format
    Name: %{time_namelookup} \n
    Connect:%{time_connect} 
    AppConn: %{time_appconnect} Pretransfer: %{time_pretransfer} redirect: %{time_redirect} \n
    Total:%{time_total} \n
    
    > curl --write-out @curl-stats  wepc.us |\
    tail -n4|sed 's/0\.//g ; s/[[:digit:]]\{3\} / /g'  # show in milliseconds on mac os
      % Total    % Received % Xferd  Average Speed   Time    Time     Time   Current
                                     Dload  Upload   Total   Spent    Left   Speed
    100  1716  100  1716    0     0  12080      0 --:--:-- --:--:-- --:--:-- 12617
    
    Name: 007 
    Connect:070 AppConn: 000 Pretransfer: 070 redirect: 000 
    Total:142 
    
     curl --write-out @curl-stats  https://$RWS/docs/XBeePaper.pdf |\
    tail -n40|sed 's/0\.//g ; s/[[:digit:]]\{3\} / /g'
      % Total    % Received % Xferd  Average Speed   Time    Time     Time    Current
                                     Dload  Upload   Total   Spent    Left    Speed
    100 1420k  100 1420k    0     0   465k      0  0:00:03  0:00:03 --:--:--  467k
    
    %%EOF
    Name: 011 
    Connect:071 AppConn: 154 Pretransfer: 154 redirect: 000 
    Total:3.048 
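    The write-out variables work for file:// URLs too, which makes it easy to try out a format string without network access. A minimal sketch (file path illustrative):

```shell
# create a 5-byte file, discard the body, print only the write-out string
printf 'hello' > /tmp/wout-demo.txt
curl -s -o /dev/null -w 'bytes=%{size_download}\n' file:///tmp/wout-demo.txt
# prints bytes=5
```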
    
    -x
    --proxy [protocol://][user:password@]proxyhost[:port]
    Overrides the proxy environment variables; use --proxy "" to override them with no proxy.
    All operations performed via an HTTP proxy will be converted to HTTP, and some operations are not available.
    Default port is 1080.
    protocol may be: http, https, socks4, socks4a, socks5 or socks5h. Default is http.

    Use tunneling through the proxy with --proxytunnel

      -p
    --proxytunnel
    With --proxy, causes non-HTTP protocols to tunnel through the proxy.
    The tunnel is made with the HTTP proxy CONNECT request and requires that the proxy allows direct connect to the remote port number curl wants to tunnel through to.
    -X
    --request command
    (HTTP) A request method to use instead of the default GET, for example PUT and DELETE.
    WebDAV offers PROPFIND, COPY, MOVE and more.
    (FTP) command replaces LIST.
    --xattr Store metadata in extended file attributes.
    URL is stored in the xdg.origin.url attribute.
    HTTP content type is stored in mime_type attribute.
    -z
    --time-cond
            [-]date expression
    Request a file modified later than the date expression, or
    before that time. The date expression can be in various formats; if it doesn't match any internal format, curl takes the modification time from a file with that name!
    See curl_getdate for date expression details.
    Start the date expression with a dash (-) to request a document that is older than the given date/time;
    the default is a document that is newer than the specified date/time.
    --libcurl file Append to a command line to get libcurl-using C source code written to the file that does the equivalent of what the command-line operation does!

    Does not support -F and the sending of multipart formposts.

    -h
    --help
    -M
    --manual
    Manual. Display the huge help text.
    -0
    --http1.0
    (HTTP) Issue requests using HTTP 1.0.
    -4
    --ipv4
    resolve names to IPv4 addresses only.
    -6
    --ipv6
    resolve names to IPv6 addresses only.
    -f
    --fail
    Fail silently (no output at all) on server errors, to enable scripts to deal with failed attempts.
    Return error 22.
    Some nonsuccessful response codes will be output for example when authentication is involved (response codes 401 and 407).
    -V
    --version
    Includes the full version of curl, libcurl and other 3rd party libraries linked with the executable.
    Features include (see full documentation for additional details): libz decompression over HTTP
    SPNEGO Negotiate authentication
    IDN international domain names.
    TLS-SRP SRP Secure Remote Password

    As of macOS Monterey, 09/16/22:
    curl 7.79.1 (x86_64-apple-darwin21.0) 
    libcurl/7.79.1 (SecureTransport) LibreSSL/3.3.6 zlib/1.2.11 nghttp2/1.45.1
    Release-Date: 2021-09-22
    Protocols: dict file ftp ftps gopher gophers http https imap imaps ldap ldaps 
               mqtt pop3 pop3s rtsp smb smbs smtp smtps telnet tftp 
    Features: alt-svc AsynchDNS GSS-API HSTS HTTP2 HTTPS-proxy IPv6 Kerberos 
               Largefile libz MultiSSL NTLM NTLM_WB SPNEGO SSL UnixSockets 

    Proxy Servers

    HTTP and SOCKS proxy servers, with optional authentication. No support for FTP proxy servers; however curl can use both HTTP and SOCKS proxies to transfer files to and from FTP servers.

    Get an FTP file using an HTTP proxy named my-proxy that uses port 888: curl -x my-proxy:888 ftp://ftp.leachsite.com/README

    Get a file from an HTTP server that requires user and password, using the same proxy: curl -u user:passwd -x my-proxy:888 http://www.get.this/

    Some proxies require special authentication. Specify by using -U as above: curl -U user:passwd -x my-proxy:888 http://www.get.this/

    A comma-separated list of hosts which do not use the proxy can be specified: curl --noproxy localhost,get.this -x my-proxy:888 http://www.get.this/

    Most FTP proxy servers are set up to appear as a normal FTP server from the client's perspective, with special commands to select the remote FTP server. curl supports the -u, -Q and --ftp-account options that can be used to set up transfers through many FTP proxies. For example, a file can be uploaded to a remote FTP server using a Blue Coat FTP proxy with the options:

    
      curl -u "Remote-FTP-Username@remote.ftp.server Proxy-Username:Remote-Pass" \
      --ftp-account Proxy-Password --upload-file local-file \
      ftp://my-ftp.proxy.server:21/remote/upload/path/
    
    
    The proxy string may be specified with a protocol:// prefix.

    Default is HTTP.
    The supported proxy protocol prefixes are:

    socks4:// --socks4
    socks4a:// --socks4a
    socks5:// --socks5
    socks5h:// --socks5-hostname

    ENVIRONMENT

    Environment variables, UPPER or lower case (lower case has precedence; $http_proxy must be lower case).
    Have the same effect as using --proxy.
    $NO_PROXY comma-separated list of hosts that shouldn't go through any proxy. If set to an asterisk *, it matches all hosts.

    FILES

    config file, --config : ~/.curlrc

    EXIT CODES

    
        1   Unsupported protocol.
        2   Failed to initialize. 
        3   URL malformed, syntax error.
        4   A feature or option was not enabled at buildtime. 
        5   Couldn't resolve proxy. 
        6   Couldn't resolve host. 
        7   Failed to connect to host. 
        8   FTP unknown server reply.
        9   FTP access denied: the server denied login or denied access to the resource or directory, 
                        often a change to a directory that doesn't exist. 
        11   FTP unknown PASS reply.
        13   FTP unknown PASV reply.
        14   FTP unknown 227 format reply.
        15   FTP can't resolve host received in the 227-line. 
        17   FTP couldn't set binary.
        18   Partial file.
        19   FTP couldn't download/access the file, RETR (or similar) command failed. 
        21   FTP quote command returned error. 
        22   HTTP page not retrieved. URL was not found or returned another error with an HTTP code 
                  of 400 or above. Only with --fail. 
        23   Write error to a local filesystem or similar. 
        25   FTP couldn't STOR file.
        26   Read error.
        27   Out of memory.
        28   Operation timeout.
        30   FTP PORT failed.  try doing a transfer using PASV instead! 
        31   FTP couldn't use REST used for resumed FTP transfers. 
        33   HTTP range error.
        34   HTTP post error. Internal post-request generation error. 
        35   SSL connect error.
        36   FTP bad download resume.
        37   FILE couldn't read file. Permissions? 
        38   LDAP cannot bind.
        39   LDAP search failed. 
        41   LDAP Function not found. 
        42   Aborted by callback. An application told curl to abort . 
        43   Internal error.
        45   Interface error.
        47   Too many redirects
        48   Unknown option specified to libcurl. 
        49   Malformed telnet option. 
        51   peer's SSL certificate or SSH MD5 fingerprint was not OK. 
        52   server didn't reply anything. 
        53   SSL crypto engine not found. 
        54   Cannot set SSL crypto engine as default. 
        55   Failed sending network data. 
        56   Failure in receiving network data. 
        58   Problem with the local certificate. 
        59   Couldn't use specified SSL cipher. 
        60   Peer certificate cannot be authenticated with known CA certificates. 
        61   Unrecognized transfer encoding. 
        62   Invalid LDAP URL. 
        63   Maximum file size exceeded. 
        64   Requested FTP SSL level failed. 
        65   Sending the data requires a rewind that failed. 
        66   Failed to initialise SSL Engine. 
        67   user name, password, etc error on login 
        68   File not found on TFTP server. 
        69   Permission problem on TFTP server. 
        70   Out of disk space on TFTP server. 
        71   Illegal TFTP operation. 
        72   Unknown TFTP transfer ID. 
        73   File already exists (TFTP). 
        74   No such user (TFTP). 
        75   Character conversion failed. 
        76   Character conversion functions required. 
        77   Problem with reading the SSL CA cert (path? access rights?). 
        78   resource referenced in the URL does not exist. 
        79   unspecified error during SSH session. 
        80   Failed to shut down the SSL connection. 
        82   Could not load CRL file, missing or wrong format 
        83   Issuer check failed 
        84   FTP PRET failed 
        85   RTSP: mismatch of CSeq numbers 
        86   RTSP: mismatch of Session Identifiers 
        87   unable to parse FTP file list 
        88   FTP chunk callback reported error 
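    These codes are meant for scripting. A minimal sketch of checking one (the foo:// scheme is deliberately bogus; the `|| rc=$?` keeps the non-zero status from aborting a `set -e` script):

```shell
# an unsupported protocol makes curl exit with code 1
rc=0
curl -s foo://example/ 2>/dev/null || rc=$?
echo "$rc"   # prints 1
```

    With --fail, HTTP responses of 400 or above turn into exit code 22, so `curl -sf` is a common scripting idiom.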
    

    SEE ALSO

    ftp and wget

    LATEST VERSION at: http://curl.haxx.se

    examples

    main page from Netscape's web-server: curl http://www.netscape.com/

    Get the README file (it's big) from funet's ftp-server: curl ftp://ftp.funet.fi/README --output funet.fi.README.txt

    web page using port 8000: curl http://www.weirdserver.com:8000/ (example only; there's nothing at weirdserver.com)

    Get a list of a directory of an FTP site: curl ftp://cool.haxx.se/

    Get the matches of curl from a dictionary: curl dict://dict.org/m:curl

    Fetch two documents at once: curl ftp://cool.haxx.se/ http://www.weirdserver.com:8000/

    Get a file from an FTPS server: curl ftps://files.are.secure.com/secrets.txt
    or use the more appropriate FTPS way: curl --ftp-ssl ftp://files.are.secure.com/secrets.txt

    Get a file from an SSH server using SFTP: curl -u username sftp://shell.example.com/etc/issue

    Get a file from an SSH server using SCP, using a private key to authenticate: curl -u username: --key ~/.ssh/id_dsa --pubkey ~/.ssh/id_dsa.pub scp://shell.example.com/~/personal.txt

    Get the main page from an IPv6 web server: curl -g "http://[2001:1890:1112:1::20]/"

    DOWNLOAD TO A FILE

    Get a web page and store in a local file: curl -o thatpage.html http://www.aol.com/

    Get a web page and store in a local file, make the local file get the name of the remote document (if no file name part is specified in the URL, this will fail): curl -O http://www.netscape.com/index.html

    Fetch two files and store them with their remote names: curl -O www.haxx.se/index.html -O curl.haxx.se/download.html

    USING PASSWORDS

    FTP

    curl ftp://name:passwd@machine.domain:port/full/path/to/file
    or use -u :
    curl -u name:passwd ftp://machine.domain:port/full/path/to/file

    FTPS

    For SSL, use protocol FTPS, or protocol FTP with --ftp-ssl.

    SFTP / SCP

    Allows specifying a private key instead of a password.
    The key may itself be protected by a password that is unrelated to the login password.
    If you provide a private key file you must also provide a public key file.

    HTTP

    user and password :
    curl http://name:passwd@machine.domain/full/path/to/file
    or use -u:
    curl -u name:passwd http://machine.domain/full/path/to/file

    HTTP offers many different methods of authentication and curl supports several: Basic, Digest, NTLM and Negotiate. Without being told which method to use, curl defaults to Basic. You can also ask curl to pick the most secure of the methods the server accepts for the given URL, with --anyauth.

    Since HTTP URLs don't support user and password, you can't use that style when using curl via a proxy; you must use -u in such circumstances.

    HTTPS

    Used with private certificates; see the HTTPS section below. (For FTP proxies, see the manual for your proxy to determine the form it expects to set up transfers, and use curl's -v option to see exactly what curl is sending.)

    RANGES

    Use -r to get parts of a document.

    Get the first 100 bytes of a document: curl -r 0-99 http://www.get.this/

    Get the last 500 bytes of a document: curl -r -500 http://www.get.this/

    curl also supports simple ranges for FTP files; only the start and stop positions can be specified. Get the first 100 bytes of a document using FTP: curl -r 0-99 ftp://www.get.this/README

    UPLOADING

    FTP / FTPS / SFTP / SCP

    Upload all data on stdin : curl -T - ftp://ftp.upload.com/myfile

    Upload data from a specified file, login with user and password: curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile

    Upload a file to the remote site: curl -T uploadfile -u user:passwd ftp://ftp.upload.com/

    Upload a file and append it to the remote file: curl -T localfile -a ftp://ftp.upload.com/remotefile

    FTP upload through a proxy: curl --proxytunnel -x proxy:port -T localfile ftp.upload.com

    HTTP

    Upload all data on stdin to a specified HTTP site: curl -T - http://www.upload.com/myfile

    The HTTP server must be configured to accept PUT before this can be done successfully.

    For other ways to do http data upload, see the POST section below.

    Verbose / Debug

    Use -v to get verbose fetching: curl -v ftp://ftp.upload.com/

    For more details and information use --trace or --trace-ascii with a file name to log to: curl --trace trace.txt www.haxx.se

    Detailed information

    Different protocols provide different ways of getting detailed information about specific files/documents.
    To show detailed information about a single file, use --head which displays all available info on a single file for HTTP and FTP. The HTTP information is a lot more extensive.

    For HTTP, to get the header information (as with --head) before the data, use --include.
    --dump-header file works when getting files from both FTP and HTTP; it stores the headers in the specified file.

    Store the HTTP headers in a separate file (headers.txt in the example): curl --dump-header headers.txt curl.haxx.se

    POST (HTTP)

    Use -d data. The post data must be URL-encoded.

    Post a simple "name" and "phone" guestbook: curl -d "name=Rafael%20Sagula&phone=3320780" http://www.where.com/guest.cgi

    Example #1:
    Extract the <input> tags in the form (see the perl program formfind.pl on the curl site).

    If there's a "normal" post, use -d to post. in the format : variable1=data1&variable2=data2&...

    The 'variable' names are the names set with "name=" in the <input> tags, and the data is the contents to fill in for the inputs. The data must be properly URL-encoded: replace space with + and encode special characters with %XX, where XX is the hexadecimal representation of the character's ASCII code.
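    As a sketch of that encoding rule, here is a small bash helper (hypothetical, not part of curl; modern curl also offers --data-urlencode to do this for you):

```shell
# percent-encode a single form value: space -> '+', other specials -> %XX
urlencode() {
  local s=$1 out='' c i
  for (( i = 0; i < ${#s}; i++ )); do
    c=${s:i:1}
    case $c in
      [a-zA-Z0-9.~_-]) out+=$c ;;      # unreserved characters: copy as-is
      ' ')             out+='+' ;;     # space becomes '+'
      *) printf -v c '%%%02X' "'$c"    # everything else: %XX of the ASCII code
         out+=$c ;;
    esac
  done
  printf '%s\n' "$out"
}

urlencode 'Rafael Sagula & co'   # prints Rafael+Sagula+%26+co
```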

    Example: page located at formpost.com/testpost/

     <form action="post.cgi" method="post">
        <input name=user >
        <input name=pass type=password >
        <input name=id type=hidden value="blablabla">
        <input name=ding value="submit">
        </form> 

    To post to this: curl --data "user=me&pass=12345&id=myid&ding=submit" http://formpost.com/testpost/post.cgi

    --data uses the application/x-www-form-urlencoded mime-type, generally understood by CGIs and similar. curl also supports the more capable multipart/form-data type, which supports file uploads, etc. -F accepts parameters like -F "name=contents". To read the contents from a file, use @filename as contents. To specify a file content type, append ;type=mimetype to the file name.
    To post the contents of several files in one field, for example a field named 'coolfiles' used to send three files with different content types: curl -F "coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html" http://www.post.com/postit.cgi

    If the content-type is not specified, curl will try to guess from the file extension (it only knows a few), or use the previously specified type (from an earlier file if several files are specified in a list) or else it will use the default type 'application/octet-stream'.

    Emulate a fill-in form with -F. To fill three fields in a form: one field is a file to post, one field is your name and one field is a file description. We want to post the file named "cooltext.txt". To let curl do the posting of this data instead of your favourite browser, read the HTML source of the form page and find the names of the input fields. In this example, the input field names are 'file', 'yourname' and 'filedescription'. curl -F "file=@cooltext.txt" -F "yourname=Daniel" \
    -F "filedescription=Cool text file with cool text inside" \
    http://www.post.com/postit.cgi

    To send two files in one post :

  • Send multiple files in a single "field" with a single field name: curl -F "pictures=@dog.gif,cat.gif"
  • Send two fields with two field names: curl -F "docpicture=@dog.gif" -F "catpicture=@cat.gif" To send a field value literally without interpreting a leading '@' or '<', or an embedded ';type=', use --form-string.

    Referrer

    An HTTP request can include which address referred it to the page:
    curl -e www.coolsite.com http://www.showme.com/
    The referer: [sic] field is a full URL.

    User Agent

    An HTTP request can report the "browser" that generated the request:
    curl -A 'mozilla/3.0 (win95; i)' http://www.nationsbank.com/

    Other common strings:

      'Mozilla/3.04 (win95; u)'  netscape version 3 for windows 95
      'Mozilla/2.02 (os/2; u)'   netscape version 2 for os/2
      'Mozilla/4.04 [en] (x11; u; aix 4.2; nav)'      NS for AIX
      'Mozilla/4.05 [en] (x11; u; linux 2.0.32 i586)'   NS for Linux 
      'Mozilla/4.0 (compatible; msie 4.01; windows 95)'  mSIE for W95 
      'Konqueror/1.0'       kde file manager desktop client
      'Lynx/2.7.1 libwww-fm/2.14' lynx command line browser 
    

    cookies

    Keep state information at the client's side. The server sets cookies by sending a response header like set-cookie: data, where the data contains a set of name=value pairs (separated by semicolons ';' like "name1=value1; name2=value2;").
    The server can specify the path the "cookie" should be used for (by specifying "path=value"), when the cookie expires ("expire=date"), for what domain to use it ("domain=NAME") and if it should be used on secure connections only ("secure").

    If you've received a page from a server that contains a header like:

    set-cookie: sessionid=boo123; path="/foo"; 

    it means the server wants that first pair passed on when we get anything in a path beginning with "/foo". Example, get a page that wants my name passed in a cookie: curl -b "name=Daniel" www.sillypage.com

    To use previously received cookies in following sessions, store them in a file: curl --dump-header headers www.example.com
    ... then in another connection to that (or another) site, use the cookies from the 'headers' file: curl -b headers www.example.com

    Saving headers to a file is error-prone and not the preferred way;
    save the incoming cookies using the well-known netscape cookie format instead: curl -c cookies.txt www.example.com

    -b enables "cookie awareness", and with -L curl follows a Location: header (often used in combination with cookies). If a site sends cookies and a location, use a non-existing file to trigger cookie awareness, for example:
    curl -L -b empty.txt www.example.com

    The file to read cookies from must be formatted using plain HTTP headers OR as netscape's cookie file. In the above command, the header is stored with the cookies received from www.example.com. curl will send to the server the stored cookies which match the request as it follows the location. The file "empty.txt" may be a nonexistent file.

    to both read and write cookies from a "netscape cookie file", set both -b and -c to use the same file: curl -b cookies.txt -c cookies.txt www.example.com

    Progress Meter

     % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                     Dload  Upload   Total   Spent    Left  Speed
    100 32715  100 32715    0     0   184k      0 --:--:-- --:--:-- --:--:--  190k
    
    
      %            - percent complete of the whole transfer
      Total        - total size of the expected transfer
      Received     - downloaded bytes
      Xferd        - uploaded bytes
      Time Total   - expected total time
      Time Current - time since the transfer started
      Curr.Speed   - average speed over the last 5 seconds 
    
    Alternately, -# / --progress-bar displays a simpler progress bar instead.

    Speed Limits

    -y and -Y abort transfers if the transfer speed is below the lowest limit for a specified time.

    To abort the download if the speed is slower than 3000 bytes per second for 1 minute, use: curl -Y 3000 -y 60 www.far-away-site.com

    Used with the overall time limit, the operation must be completed within 30 minutes (1800 seconds): curl -m 1800 -Y 3000 -y 60 www.far-away-site.com

    Restrict Transfer speed to no faster than 10 kilobytes per second: curl --limit-rate 10K www.far-away-site.com
    or
    curl --limit-rate 10240 www.far-away-site.com

    Or limit uploading to no faster than 1 megabyte per second: curl -T upload --limit-rate 1M ftp://uploadRelease.com

    When using --limit-rate, the transfer rate is regulated on a per-second basis, which may cause the transfer speed to be lower than specified.

    Config File

    ~/.curlrc (or _curlrc on MSwin )

    Use command line switches, or long options without the leading dashes, separated from their parameter by spaces, = or :.
    # begins a comment. Enclose a parameter that contains spaces in quotes ("); escape a quote as \".
    An option and its parameter must be on the same line.
    Example, set default time out and proxy in a config file:

     # 30 minute timeout:
     -m 1800
     # proxy for all accesses:
     proxy = proxy.our.domain.com:8080 
    White space IS significant at the end of lines, but ignored at the beginning.
    Display a local help page when invoked without URL with a config file similar to:
    # default url to get
              url = "http://help.with.curl.com/curlhelp.html"

    example: echo "user = user:passwd" | curl -K - http://that.secret.site.com
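    A runnable sketch of these rules (file names illustrative; a file:// URL is used so it works without network access):

```shell
# build a tiny config file and let curl read it with -K
printf 'hi' > /tmp/cfgdemo-body.txt
cat > /tmp/cfgdemo.curlrc <<'EOF'
# long options may be given without the leading dashes
silent
# quotes protect parameters containing spaces
url = "file:///tmp/cfgdemo-body.txt"
EOF
curl -K /tmp/cfgdemo.curlrc   # prints hi
```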

    Extra headers

    When using curl in your programs, you may need to pass custom headers when getting a web page. Use -H.

    Example, send the header "X-you-and-me: yes" to the server when getting a page: curl -H "X-you-and-me: yes" www.love.com

    Useful to send different text in a header than curl normally would. A header given with -H replaces the one curl would normally send. If you replace an internal header with an empty one, you prevent that header from being sent. To prevent the Host: header from being used: curl -H "Host:" www.server.com

    FTP and PATH names

    When getting files with an ftp:// URL, the path is relative to the login directory. To get README from your home directory at your FTP site, use:
    curl ftp://user:passwd@my.site.com/README
    To get the README file from the root directory of that site, specify the absolute file name:
    curl ftp://user:passwd@my.site.com//README

    SFTP and SCP and PATH NAMES

    The path name given is the absolute name on the server. To access a file relative to the remote user's home directory, prefix the file with /~/ , such as:
    curl -u $USER sftp://home.example.com/~/.bashrc

    FTP and firewalls

    FTP requires one of the parties to open a second connection as soon as data is about to be transferred.

    The default issues PASV which causes the server to open another port and await another connection performed by the client. This is good if the client is behind a firewall that doesn't allow incoming connections.

    curl ftp.download.com
    If the server is behind a firewall that doesn't allow connections on ports other than 21 (or doesn't support PASV), use PORT with an IP number and port.

    Use -P to select a specific interface. The default address can also be used:

    curl -P - ftp.download.com
    Download with PORT but use the IP address of 'le0' interface (not on MSwindows):
    curl -P le0 ftp.download.com
    Download with PORT and use 192.168.0.10 as the client IP address :
    curl -P 192.168.0.10 ftp.download.com

    Network Interface

    Use a specified interface: curl --interface eth0:1 http://www.netscape.com/

    or

    curl --interface 192.168.1.10 http://www.netscape.com/

    HTTPS

    Example:
    curl https://www.secure-site.com
    To use your personal certificates to get/post files from sites that require valid certificates, use PEM format (not PKCS#12). Example of automatically retrieving a document using a certificate with a personal password: curl -E /path/to/cert.pem:password https://secure.site.com/

    Omitting the password will prompt for the password.

    Resuming file transfers

    Continue downloading a document: curl -C - -o file ftp://ftp.server.com/path/file

    Continue uploading a document(*1): curl -C - -T file ftp://ftp.server.com/path/file

    Continue downloading a document from a web server(*2): curl -C - -o file http://www.server.com/
    (*1) the FTP server must support SIZE.
    (*2) the web server must support at least HTTP/1.1.

    TIME CONDITIONS

    HTTP allows a client to specify a time condition for the document it requests, using an If-Modified-Since or If-Unmodified-Since header.
    Use the --time-cond flag.
    Download only if the remote file is newer than a local copy:
    curl --time-cond local.html http://remote.server.com/remote.html

    Download a file only if the local file is newer than the remote by prepending the date string with a -, as in:
    curl --time-cond -local.html http://remote.server.com/remote.html

    Specify a "free text" date as the condition. Tell curl to only download the file if it was updated since January 12, 2012:
    curl --time-cond "Jan 12 2012" http://remote.server.com/remote.html

    curl accepts a wide range of date formats. You can always make the date check the other way around by prepending it with a dash '-'.

    DICT

    try

    
        curl dict://dict.org/m:curl
     220 dict.dict.org dictd 1.12.1/rf on Linux 4.19.0-10-amd64  <143990903.1860.1664736598@dict.dict.org>
    250 ok
    152 18 matches found
    gcide "cul"
    gcide "Cur"
    gcide "Churl"
    gcide "Curle"
    gcide "Curly"
    gcide "Burl"
    gcide "Furl"
    gcide "Gurl"
    gcide "Hurl"
    gcide "Nurl"
    gcide "Purl"
    gcide "Carl"
    gcide "Cull"
    gcide "Curb"
    gcide "Curd"
    gcide "Cure"
    gcide "Curr"
    gcide "Curt"
    .
    250 ok [d/m/c = 0/18/10699; 0.000r 0.000u 0.000s]
    221 bye [d/m/c = 0/0/0; 0.000r 0.000u 0.000s]
    
        curl dict://dict.org/d:heisenbug:jargon
        curl dict://dict.org/d:daniel:web1913
    
    Aliases for m are match and find.
    for d are define and lookup.
    For example,
        curl dict://dict.org/find:curl
    Commands that break the URL description of the RFC (but not the DICT protocol) are
     curl dict://dict.org/show:db
        curl dict://dict.org/show:strat 
    Authentication is missing (not required by the RFC)

    LDAP

    With the OpenLDAP library, curl supports ldap:// URLs.

    Composing an LDAP query is not an easy task; dig up the syntax description elsewhere. One place that might suit you: docs.oracle.com/cd/E19957-01/817-6707/index.html

    RFC 2255, "The LDAP URL Format": http://curl.haxx.se/rfc/rfc2255.txt

    Get all people from a local LDAP server that have a certain sub-domain in their email address: curl -B "ldap://ldap.frontec.se/o=frontec??sub?mail=*sth.frontec.se"

    If I want the same info in HTML format, I can get it by not using the -B (enforce ASCII) flag.

    ENVIRONMENT VARIABLES for proxies

    General proxy: ALL_PROXY

    For protocol-specific proxies: $http_proxy, $HTTPS_PROXY, $FTP_PROXY

    Hosts within domains, in a comma-separated list ('*' matches all hosts), will not use proxies: NO_PROXY

    -x and --proxy override the environment variables.

    NETRC

    .netrc contains name and password. Unix programs won't read this file unless it is readable only by the user; curl will!

    With --netrc (or --netrc-optional), it will be used for all protocols where authentication is required. A very simple .netrc could look something like:
    machine curl.haxx.se login iamdaniel password mysecret

    Custom Output

    To better allow script programmers to know about the progress, use --write-out to specify what information from the previous transfer to extract. To display the number of bytes downloaded together with some text and an ending newline: curl -w 'We downloaded %{size_download} bytes\n' www.download.com

    Kerberos FTP Transfer

    Kerberos support must be built in. First, get a krb-ticket with the kinit/kauth tool. Then use curl like: curl --krb private ftp://krb4site.com -u username:fakepwd

    There's no use for a password with -u; a blank one makes curl ask for one, and you already entered the real password to kinit/kauth.

    Telnet

    curl telnet://remote.server.tld

    Enter the data to pass to the server on stdin. The result will be sent to stdout or to the file specified by --output.

    Use -N/--no-buffer to disable buffered output.

    Pass options to the telnet protocol negotiation with -t. To use a vt100 terminal: curl -tTTYPE=vt100 telnet://remote.server.tld

    Other -t options include:
      XDISPLOC=<X display>  Sets the X display location.
      NEW_ENV=<var,val>     Sets an environment variable.

    PERSISTENT CONNECTIONS

    Specifying multiple files on a single command line will make curl transfer all of them, one after the other in the specified order.

    libcurl will attempt to use persistent connections for the transfers, so that a second transfer to the same host can reuse a connection that was already initiated and left open by the previous transfer. This greatly decreases connection time for all but the first transfer, and makes far better use of the network.

    Note that curl cannot use persistent connections across separate curl invocations. Try to put as many URLs as possible on the same command line if they use the same host, as that makes the transfers faster. If you use an HTTP proxy for file transfers, practically all transfers will be persistent.

    MULTIPLE TRANSFERS WITH A SINGLE COMMAND LINE

    As mentioned above, you can download multiple files with one command line by simply adding more URLs. If you want those saved to local files instead of printed to stdout, add one save option for each URL you specify. This also goes for -O (but not --remote-name-all). For example, get two files, using -O for the first and a custom file name for the second: curl -O http://url.com/file.txt ftp://ftp.com/moo.exe -o moo.jpg

    You can also upload multiple files in a similar fashion: curl -T local1 ftp://ftp.com/moo.exe -T local2 ftp://ftp.com/moo2.txt
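    The connection reuse described above can be sketched with Python's http.client against a hypothetical local stand-in server (this illustrates the keep-alive mechanism, not libcurl's implementation):

```python
from http.client import HTTPConnection
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
import threading

class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"   # keep-alive: the TCP connection stays open
    def do_GET(self):
        body = b"hello\n"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):   # keep the demo quiet
        pass

# Local server standing in for a remote host.
server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# One connection, two requests: like two same-host URLs on one curl
# command line, the second transfer reuses the already-open socket.
conn = HTTPConnection("127.0.0.1", server.server_address[1])
statuses, sockets = [], []
for path in ("/file1.txt", "/file2.txt"):
    conn.request("GET", path)
    resp = conn.getresponse()
    resp.read()                 # drain the body before reusing the connection
    statuses.append(resp.status)
    sockets.append(conn.sock)
reused = sockets[0] is sockets[1]
conn.close()
server.shutdown()
print(statuses, reused)         # [200, 200] True
```

    Only the first request pays the connection setup cost; the second rides the open socket, which is exactly why same-host URLs belong on one command line.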


    Example with verbose:
    curl -v https://ruuvi.slack.com/files/U0AGHBZFU/F7FK1B2L9/ruuvitag-full.zip
    *  Trying 52.84.32.203...
    * TCP_NODELAY set
    * Connected to ruuvi.slack.com (52.84.32.203) port 443 (#0)
    * ALPN, offering h2
    * ALPN, offering http/1.1
    * Cipher selection: ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:@STRENGTH
    * successfully set certificate verify locations:
    *  CAfile: /etc/ssl/cert.pem
     CApath: none
    * TLSv1.2 (OUT), TLS handshake, Client hello (1):
    * TLSv1.2 (IN), TLS handshake, Server hello (2):
    * TLSv1.2 (IN), TLS handshake, Certificate (11):
    * TLSv1.2 (IN), TLS handshake, Server key exchange (12):
    * TLSv1.2 (IN), TLS handshake, Server finished (14):
    * TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
    * TLSv1.2 (OUT), TLS change cipher, Client hello (1):
    * TLSv1.2 (OUT), TLS handshake, Finished (20):
    * TLSv1.2 (IN), TLS change cipher, Client hello (1):
    * TLSv1.2 (IN), TLS handshake, Finished (20):
    * SSL connection using TLSv1.2 / ECDHE-RSA-AES128-GCM-SHA256
    * ALPN, server accepted to use h2
    * Server certificate:
    * subject: C=US; ST=California; L=San Francisco; O=Slack Technologies, Inc.; CN=*.slack.com
    * start date: Feb 1 00:00:00 2017 GMT
    * expire date: Feb 1 23:59:59 2019 GMT
    * subjectAltName: host "ruuvi.slack.com" matched cert's "*.slack.com"
    * issuer: C=US; O=GeoTrust Inc.; CN=GeoTrust SSL CA - G3
    * SSL certificate verify ok.
    * Using HTTP2, server supports multi-use
    * Connection state changed (HTTP/2 confirmed)
    * Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
    * Using Stream ID: 1 (easy handle 0x7fb07c005800)
    > GET /files/U0AGHBZFU/F7FK1B2L9/ruuvitag-full.zip HTTP/2
    > Host: ruuvi.slack.com
    > User-Agent: curl/7.54.0
    > Accept: */*
    >
    * Connection state changed (MAX_CONCURRENT_STREAMS updated)!
    < HTTP/2 302
    < content-type: text/html
    < content-length: 0
    < location: https://ruuvi.slack.com/?redir=%2Ffiles%2FU0AGHBZFU%2FF7FK1B2L9%2Fruuvitag-full.zip
    < date: Wed, 11 Oct 2017 11:56:17 GMT
    < referrer-policy: no-referrer
    < server: Apache
    < set-cookie: b=e9god2rphvcwwkc08ccoowsgg; expires=Mon, 11-Oct-2027 11:56:17 GMT; Max-Age=315532800; path=/; domain=.slack.com
    < strict-transport-security: max-age=31536000; includeSubDomains; preload
    < vary: Accept-Encoding
    < x-frame-options: SAMEORIGIN
    < x-robots-tag: noindex
    < x-slack-backend: h
    < x-cache: Miss from cloudfront
    < via: 1.1 ac094a1c1bf8cbfbb98e93fa2b2431c0.cloudfront.net (CloudFront)
    < x-amz-cf-id: 4-Uk7ft…  (changes with every retrieval; Amazon CloudFront)
    <
    * Connection #0 to host ruuvi.slack.com left intact
    

    IPv6

    curl connects with IPv6 when a host lookup returns an IPv6 address and falls back to IPv4 if the connection fails. The --ipv4 and --ipv6 options specify which address family to use when both are available. IPv6 addresses can also be given directly in URLs using the syntax: http://[2001:1890:1112:1::20]/overview.html

    When this style is used, -g (--globoff) is required so the brackets are not taken as URL globbing.
    Link-local and site-local addresses, including a scope identifier such as fe80::1234%1, may be used; the scope portion must be numeric and the percent character must be URL-escaped as %25.
    example: sftp://[fe80::1234%251]/

    IPv6 addresses given outside of URLs (e.g. to the --proxy, --interface or --ftp-port options) should not be URL-encoded.

    URL syntax is described in detail in RFC 3986.
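    The bracketed-address syntax follows RFC 3986 and can be cross-checked with Python's urllib.parse (a sketch of the URL rules only; curl has its own parser):

```python
from urllib.parse import urlsplit

# An IPv6 literal must be bracketed so its colons are not read as a port.
parts = urlsplit("http://[2001:1890:1112:1::20]/overview.html")
print(parts.hostname)   # 2001:1890:1112:1::20  (brackets stripped)
print(parts.path)       # /overview.html

# Link-local address with a numeric scope id; the percent sign is
# URL-escaped as %25 and is not decoded by the parser.
scoped = urlsplit("sftp://[fe80::1234%251]/")
print(scoped.hostname)  # fe80::1234%251
```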

    Firewalls

    Blocked access attempts show up in firewall (iptables) logs like:
    Blocked IN=eth1 OUT= MAC=20:c0:47:c2:a8:a4:f4:b5:2f:05:38:c7:08:00 
        SRC=15.72.34.52 DST=10.3.22.4 LEN=121 TOS=00 PREC=0x00 TTL=56 ID=16333 DF 
        PROTO=TCP SPT=443 DPT=13952 SEQ=1794043383 ACK=2151756732 WINDOW=23 ACK PSH URGP=0 MARK=0 
    

    Errors

    curl: (67) Access denied: 530
    Exit code 67 means the user name or password was not accepted and curl failed to log in; 530 is the FTP server's reply.

    curl_getdate

    Convert a date string to number of seconds

    time_t curl_getdate(char *datestring, time_t *now);

    Returns the number of seconds since the Epoch, January 1st 1970 00:00:00 in the UTC time zone, for the date and time that the datestring parameter specifies.

    PARSING DATES AND TIMES

    A "date" is a string containing several items separated by whitespace; the order of the items is immaterial. A date string may contain
    calendar date  YYYYMMDD or a month-name form. Month names are three-letter English abbreviations, numbers can be zero-prefixed and the year may use 2 or 4 digits. Examples: 06 Nov 1994, 06-Nov-94 and Nov-94 6.
    time of day  HH:MM:SS. Default is 00:00:00. Example: 18:19:21.
    time zone  A name or an offset relative to UTC. Supported formats include: -1200, MST, +0100.
    day of the week  Spelled out in full English: `Sunday', `Monday', etc, or abbreviated to the first three letters.

    EXAMPLES

    Sun, 06 Nov 1994 08:49:37 GMT
    Sunday, 06-Nov-94 08:49:37 GMT
    Sun Nov 6 08:49:37 1994
    06 Nov 1994 08:49:37 GMT
    06-Nov-94 08:49:37 GMT
    Nov 6 08:49:37 1994
    06 Nov 1994 08:49:37
    06-Nov-94 08:49:37
    1994 Nov 6 08:49:37
    GMT 08:49:37 06-Nov-94 Sunday
    94 6 Nov 08:49:37
    1994 Nov 6
    06-Nov-94
    Sun Nov 6 94
    1994.Nov.6
    Sun/Nov/6/94/GMT
    Sun, 06 Nov 1994 08:49:37 CET
    06 Nov 1994 08:49:37 EST
    Sun, 12 Sep 2004 15:05:58 -0700
    Sat, 11 Sep 2004 21:32:11 +0200
    20040912 15:05:58 -0700
    20040911 +0200
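    curl_getdate itself is a libcurl C function; for the RFC 822/1123-style examples above, the same conversion can be sketched with Python's standard library (a stand-in parser, not libcurl's):

```python
from email.utils import parsedate_to_datetime

def getdate(datestring):
    """Seconds since the Epoch (UTC) for an RFC 822/1123 date string."""
    return int(parsedate_to_datetime(datestring).timestamp())

print(getdate("Sun, 06 Nov 1994 08:49:37 GMT"))     # 784111777
# A -0700 local time is 7 hours behind UTC, so its epoch value is
# 25200 seconds later than the same clock reading in GMT.
print(getdate("Sun, 12 Sep 2004 15:05:58 -0700")
      - getdate("Sun, 12 Sep 2004 15:05:58 GMT"))   # 25200
```

    Note this stand-in only covers the RFC 822 family; curl_getdate also accepts the looser forms listed above.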

    STANDARDS

    Handles the date formats specified in RFC 822 (including the RFC 1123 update) with a time zone name or delta, RFC 850 (obsoleted by RFC 1036), and ANSI C's asctime() format. These are the only formats permitted by RFC 7231 for HTTP applications.