| --url URL
The URL to fetch. Required. (Not -u; that's for the user name.)
Can be given in a config file.
May be used any number of times.
| --local-port num[-num] | local port numbers
| -o --output {file | -}
[--create-dirs]
Output to file, or - for stdout.
#n in the file name is replaced with the string matched by the nth glob in the URL being fetched.
Example:
curl http://{one,two}.site.com --output "file_#1.txt"
or use several variables:
curl http://{site,host}.host[1-5].com -o "#1_#2"
Use -o as many times as there are URLs.
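The #n substitution can be tried offline with file:// URLs; this sketch assumes a POSIX shell, and the directory and file names are made up:

```shell
# "#1" in the output name picks up the string matched by the {a,b} glob;
# file:// URLs make this runnable without a network
dir=$(mktemp -d)
cd "$dir"
echo one > a.txt
echo two > b.txt
curl -s "file://$dir/{a,b}.txt" -o "copy_#1.txt"
ls copy_a.txt copy_b.txt
```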
| -O --remote-name Output to a local file (instead of stdout) named with
the remote name in the current working directory.
| -# --progress-bar
Show progress as a simple bar.
Without --progress-bar, redirecting stdout via > or -o displays the default meter:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 32715  100 32715    0     0   184k      0 --:--:-- --:--:-- --:--:--  190k
|
-s --silent
No progress meter or error messages.
| -S --show-error
Show errors even when --silent is used.
| --stderr {file | -}
Redirect stderr to file, or to stdout with -.
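A sketch of --stderr, runnable offline; the missing path is made up, and -S makes the error visible despite -s:

```shell
# curl's error message goes to the --stderr target instead of the terminal
dir=$(mktemp -d)
curl -s -S --stderr "$dir/err.txt" "file://$dir/missing.txt" || true
grep -c '^curl:' "$dir/err.txt"
```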
| -g --globoff
{}[] are not treated as metacharacters in URLs and must be percent-encoded:
{ = %7B, } = %7D, [ = %5B, ] = %5D
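With --globoff the encoding has to be done by hand; this sed helper is our own convenience, not part of curl:

```shell
# percent-encode the four glob metacharacters per the table above
encode_meta() {
  printf '%s' "$1" | sed -e 's/{/%7B/g' -e 's/}/%7D/g' -e 's/\[/%5B/g' -e 's/]/%5D/g'
}
encode_meta 'http://example.com/file[1].txt'
# then: curl -g "$(encode_meta 'http://example.com/file[1].txt')"
```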
|
HTTP specific options: Receiving
| -D file --dump-header file
Save the received headers to file.
HTTP/1.1 200 OK
Date: Mon, 10 Aug 2020 20:08:22 GMT
Server: Apache
Last-Modified: Sun, 12 Jul 2020 18:58:37 GMT
Accept-Ranges: bytes
Content-Length: 2975
Content-Type: text/html
Cookies from the saved headers can be read in a second invocation with --cookie.
--cookie-jar is a better way to store cookies.
FTP: server response lines are considered "headers"
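A cookie jar is a Netscape-format cookie file: 7 tab-separated fields (domain, subdomain flag, path, secure, expiry, name, value). A hand-written sketch, with made-up values, that a later invocation could feed back with --cookie:

```shell
# build a minimal Netscape cookie file by hand; '#' lines are comments
jar=$(mktemp)
printf '# Netscape HTTP Cookie File\n' > "$jar"
printf 'example.com\tFALSE\t/\tFALSE\t0\tsession\tabc123\n' >> "$jar"
# a real call would look like: curl --cookie "$jar" <url>
grep -c 'session' "$jar"
```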
| -i --include Include headers in the output.
HTTP/1.1 200 OK^M
Cache-Control: no-cache, no-store^M
Pragma: no-cache^M
Content-Type: text/html; charset=utf-8^M
…
Strict-Transport-Security: max-age=2592000^M
^M
<!DOCTYPE html>
<head>
<title>
…
| -I --head Fetch the headers only.
HTTP/1.1 200 OK
Cache-Control: no-cache, no-store
Pragma: no-cache
Content-Length: 32715
Content-Type: text/html; charset=utf-8
Expires: -1
Date: Fri, 06 Jan 2023 17:38:20 GMT
…
Reporting-Endpoints: wildapricot-csp-uel='https://csp.uel.wildapricot.com/report'
Strict-Transport-Security: max-age=2592000
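-I can also be tried offline: for file:// URLs curl synthesizes header-style metadata such as Content-Length (assuming curl is built with FILE protocol support, which is the default):

```shell
# HEAD-style metadata for a local file, no network needed
f=$(mktemp)
printf 'hello' > "$f"
curl -s -I "file://$f" | grep -i '^Content-Length'
```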
| -L --location
Reissue the request on the new location if the page has moved
(indicated with a Location: header and a 3XX response code).
With --include header or --head , headers from all requested pages will be shown.
When authentication is used, curl only sends credentials to the initial host.
See --location-trusted
Limit the amount of redirects to follow by using --max-redirs
If the response was 301, 302 or 303 and the request was not a GET (e.g. POST or PUT), the next request is made with GET.
If the response code was any other 3xx code, curl re-sends the request using the same method.
|
--location-trusted Like --location, but also sends the name + password to hosts redirected to.
A security issue if the site redirects to a host that shouldn't receive the authentication info (which is plaintext in the case of HTTP Basic authentication).
| --max-redirs num
Maximum number of redirects to follow with --location.
Default: 50. Setting -1 makes it limitless (not recommended).
| --post301 Do not convert POST requests to GETs when following a 301 redirect. Only with --location.
--post302 Do not convert POST requests to GETs when following a 302 redirect. Only with --location.
--compressed Request a compressed response and save it uncompressed.
| --tr-encoding Request a compressed transfer encoding and uncompress the data while receiving it.
| --raw do not decode content or transfer encodings
| HTTP specific options: sending
| -F --form name={content | @file | <file | -}
Emulate a form submission by issuing POST data using
Content-Type multipart/form-data.
@ attaches a file upload (ex: datafile=@input.dat).
< takes the content from a file, which makes a text field.
For stdin use -.
curl -F "web=@index.html;type=text/html" url.com
or
curl -F "name=daniel;type=text/foo" url.com
Change the name of a file with ;filename= :
curl -F "file=@localfile;filename=nameinpost" url.com
Can be used multiple times.
| --form-string name=string
Similar to --form except leading @ and < characters, and the ;type= string have no special meaning.
| -d --data {@file | - | data}
--data-ascii
Sends data in a POST request, the same way a browser does when a form is submitted,
using the
content-type application/x-www-form-urlencoded
To URLencode the value of a form field use --data-urlencode.
If used more than once, data will be merged with a separating & .
For example: using -d name=daniel -d skill=lousy
to generate the post 'name=daniel&skill=lousy'.
Use @file to read the data from file or - from stdin.
The contents of the file must be URL-encoded.
Multiple files can also be specified.
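The merging can be observed without a real site by running a throwaway echo server on localhost; python3 availability and port 8177 are assumptions of this sketch:

```shell
# tiny localhost server that echoes the POST body back,
# making the merged -d payload visible
python3 - <<'EOF' >/dev/null 2>&1 &
from http.server import BaseHTTPRequestHandler, HTTPServer
class Echo(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers['Content-Length']))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # keep the server quiet
        pass
HTTPServer(('127.0.0.1', 8177), Echo).serve_forever()
EOF
srv=$!
sleep 1
resp=$(curl -s -d name=daniel -d skill=lousy http://127.0.0.1:8177/)
kill "$srv"
echo "$resp"
```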
| --data-urlencode data Posts data, similar to --data,
but with URL-encoding.
To be CGI-compliant, data should begin with a name followed by a separator and a content specification.
The data part can be passed to curl as:
[=]content URL-encode the content and pass it on. The = is not included in the data.
| name=content URL-encode the content part and pass it on. name must be already URL-encoded.
| @filename Load data from the file (including newlines), URL-encode it and pass it on in the POST.
| name@filename Load data from the file (including newlines), URL-encode it and pass it on in the POST.
name gets an equal sign appended, resulting in name=urlencoded-file-content.
name is expected to be already URL-encoded.
| --data-binary data Posts data with no processing.
When @ prefixes a filename, data is posted as with --data-ascii, except newlines are preserved and no conversions are done.
Used several times, the entries are appended as described under --data.
-G --get Makes data specified with --data or --data-binary be used in an HTTP GET request.
The data is appended to the URL, separated by a ?.
With -I, the data will be appended to the URL with a HEAD request.
Only the first occurrence is honored.
-H --header header Extra header to use in the request.
Using a header curl would otherwise send internally replaces it.
Remove an internal header by giving a replacement without content, example: -H "Host:" .
To send a header with no-value, terminate it with a semicolon, example: -H "X-Custom-Header;".
See --user-agent and --referer .
Can be used multiple times to add/replace/remove multiple headers.
-A --user-agent string User-Agent string to send to the HTTP server.
(Some CGIs reportedly fail if this field isn't set to "Mozilla/4.0".)
To include blanks in the string, use quotes.
Can also be set with -H --header.
| -e --referer URL Sends the "Referer Page" information to the HTTP server.
Can be set with --header .
With --location , append ;auto to the --referer
URL to set the previous URL when it follows a Location: header.
;auto can be used alone, without an initial --referer.
| -J --remote-header-name use the server-specified
Content-Disposition filename instead of extracting a filename from the URL.
| HTTP specific options: cookie handling
| -b --cookie {file | name=value[;…]}
Without an =, the argument names a file to read cookies from; incoming cookies are then also remembered (but only written out with --cookie-jar).
Useful with --location.
The file format is plain HTTP headers.
To store cookies, use --cookie-jar or --dump-header.
| -c --cookie-jar file
After a completed operation, save all known cookies (read from files as well as received from the server) to file.
If no cookies are known, no file is written.
Use - to have the cookies written to stdout.
If the cookie jar can't be created or written to, the operation doesn't fail or report an error.
Use -v to get a warning.
| -j --junk-session-cookies Discard "session cookies", as if a new session is started.
|
Authentication
| --basic Basic authentication (the default).
| -u --user user:password name:password for authentication.
Overrides --netrc and --netrc-optional.
Force an empty user name and password by specifying a single colon: --user : .
| --digest HTTP Digest authentication.
Prevents the password from being sent in clear text. Use with --user.
| -U --proxy-user user:password User name and password to use for the proxy.
| --anyauth determine authentication method and use the most secure one.
Not recommended for uploads from stdin .
| --ntlm (HTTP) Enables NTLM authentication.
To enable NTLM for proxy authentication use --proxy-ntlm.
| -E --cert certificate[:password]
(SSL) Use the given client certificate file
when getting a file with HTTPS, FTPS or another SSL-based protocol. PEM format.
The "certificate" file may be the private key and the certificate concatenated! See
--cert and --key to specify them independently.
With the NSS SSL library this option
tells curl the nickname of the certificate to use within the
NSS database defined by $SSL_DIR (or by
default /etc/pki/nssdb). If the NSS PEM PKCS#11 module (libnsspem.so) is available, then PEM files may be loaded.
To use a file from the current directory, precede it with ./
| --delegation level
What curl allows the server to do regarding delegation of user credentials. Used with GSS/Kerberos.
none Don't allow any delegation.
| policy Delegate if and only if the OK-AS-DELEGATE flag is set in the Kerberos service ticket, which is a matter of realm policy.
| always Unconditionally allow delegation.
| --ciphers list List of ciphers to use.
See the SSL ciphers documentation;
for NSS, the NSSCipherSuite entry.
| --engine name Select the OpenSSL crypto engine .
Use list to output a list of build-time supported engines. Build-time engines:
<none>
| --cert-type type
Type of the provided certificate: PEM, DER or ENG. Default PEM.
| --cacert certificateFile Use the certificate file bundle to verify
the peer. It may contain multiple CA certificates, in PEM format.
Overrides the default file; $CURL_CA_BUNDLE sets the path to a CA cert bundle.
The windows version looks for
curl-ca-bundle.crt in the same directory as curl.exe, or in the Current Working Directory, or in any
folder along PATH.
With the NSS SSL library, this tells curl the nickname of the CA certificate to use within the
NSS database defined by the $SSL_DIR (or by
default /etc/pki/nssdb ). If the NSS PEM PKCS#11 module (libnsspem.so) is available then PEM files may be loaded.
| --capath directory[:…]
Directory of CA certificates used to verify the peer.
PEM format.
With OpenSSL, the directory must have been processed with the c_rehash tool supplied with OpenSSL.
Makes SSL connections more efficient than using a --cacert file containing many entries.
| --crlfile file
(HTTPS/FTPS) Certificate Revocation List: a file in PEM format listing certificates that are revoked.
|
FTP
| -l --list-only Directory listing, names only.
| -I --head include size and modification time
| -T --upload-file file
Transfers the local file to the remote URL, keeping the same name if the URL names no file.
Use a trailing / on the last directory to use the same file name.
Use a -T for each URL on the command line.
Each -T + URL pair specifies what to upload where.
Supports "globbing" of the file argument, permitting uploading multiple files to a single URL using the same globbing style supported in URLs. For example:
curl -T "{file1,file2}" http://www.uploadtothissite.com
or
curl -T "img[1-1000].png" ftp://ftp.picmania.com/upload/
Use - for stdin.
Use . for stdin in non-blocking mode to allow reading server output while stdin is being uploaded.
With HTTP(S) PUT will be used.
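-T can be exercised offline too, since curl's FILE protocol supports upload; the paths here are made up:

```shell
# uploading to a file:// URL makes curl write the target file
dir=$(mktemp -d)
echo payload > "$dir/local.txt"
curl -s -T "$dir/local.txt" "file://$dir/uploaded.txt"
# a trailing / on the URL would reuse the local name instead
cat "$dir/uploaded.txt"
```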
| -R --remote-time retains file timestamp
| -C
--continue-at {-|offset} Continue a file transfer
With - output/input files determine continuation point.
With offset continue at that point.
| -a --append When uploading, append to the target file rather than overwrite; the file is created if absent.
| --crlf Convert LF to CRLF in uploads (as in Linux to MS Windows).
| -B --use-ascii
ASCII transfer, translating line terminators as needed.
Equivalent to appending ;type=A to the URL.
Also causes data sent to stdout to be in text mode on win32 systems.
| --ftp-create-dirs create missing directories.
| --ftp-method [method] method to reach a file in a subdirectory.
multicwd a CWD command for each path part.
| singlecwd one CWD with the full directory
| nocwd give a full path.
To save the file in a different directory, change the current working directory first.
Use as many times as the number of URLs.
| -K --config file Read options from the config file. Default .curlrc
Parameters must be specified on the same line, separated by whitespace, colon, equals sign or any combination (equals preferred).
If a parameter contains whitespace, enclose it in quotes.
Supported escape sequences: \\, \", \t, \n, \r and \v.
'#' starts a comment.
One option per line.
Give the filename as - to read from stdin.
To specify a URL in the config file, use --url:
url = "http://curl.haxx.se/docs/"
Long option names can be given without the double dashes.
Unless -q is used, curl checks for a default config file in:
1) the "home dir": $CURL_HOME and then $HOME.
On UNIX-like systems it uses getpwuid() (which returns the home dir of the current user).
On Windows it checks %APPDATA%, then %USERPROFILE%\Application Data.
2) On Windows, if there is no _curlrc file in the home dir, it checks for one in the same dir as the curl executable.
On UNIX-like systems, it will try to load .curlrc from the determined home dir.
# --- Example file ---
# this is a comment
url = "curl.haxx.se"
output = "curlhere.html"
user-agent = "superagent/1.0"
# and fetch another URL too
url = "curl.haxx.se/docs/manpage.html"
-O
referer = "http://nowhereatall.com/"
Can be used multiple times to load multiple config files.
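A config file can be tried end-to-end offline; the file names here are made up, and the file:// URL stands in for a real site:

```shell
# drive curl entirely from a config file with -K
dir=$(mktemp -d)
cd "$dir"
echo hello > page.txt
cat > my.rc <<EOF
# comments start with '#'
url = "file://$dir/page.txt"
silent
output = "saved.txt"
EOF
curl -K my.rc
cat saved.txt
```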
| -q As the first parameter on the command line, prevents reading the default config file.
|
FTP port negotiation
Passive (the server says what port to use for the data transfer) or active (the client, i.e. curl, decides).
| --ftp-pasv Use passive mode. Default; overrides --ftp-port.
Tries EPSV first and then PASV, unless --disable-epsv is used.
Passive mode: the server sets up an IP address and port.
| --ftp-skip-pasv-ip Ignore IP address the server offers in response to PASV.
| --disable-eprt --no-eprt
--eprt
| Disable/enable the EPRT and LPRT address-and-port-specifying commands.
Normally curl first uses EPRT (extended), then LPRT (long), finally PORT.
Disabling EPRT only changes the active behavior. To switch to passive mode, don't use --ftp-port, or force passive with --ftp-pasv.
--disable-epsv --no-epsv
--epsv | Disable/enable EPSV when doing passive transfers.
Disabling EPSV only changes the passive behavior. To switch to active mode, use --ftp-port.
| -P --ftp-port address Use active mode.
Tells the server to connect to the client's given address and port.
address :
interface Ex: "eth0"
| IP address Ex: "192.168.10.1"
| host name Ex: "my.host.domain"
| - same IP address used for the control connection
Disable the use of PORT with --ftp-pasv.
Disable the attempt to use EPRT instead of PORT with --disable-eprt. EPRT is the extended version of PORT.
Append :[S_port][-E_port] to the address to specify a port range. A single port number will fail if that port is not available.
--ftp-pret Send a PRET before PASV (and EPSV).
Some servers (e.g. drftpd) require this command for directory listings and transfers in PASV mode.
|
--ftp-alternative-to-user command |
If authentication with user:pass fails, send command .
-n --netrc Use ~/.netrc (_netrc on MS Windows) for name and password.
Does not support macros.
| --netrc-optional
| --netrc-file file Path to the .netrc file to use.
Overrides --netrc and --netrc-optional
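The .netrc format is three keyword/value pairs per machine; this sketch uses made-up credentials and only checks the file shape, since authenticating would need a live server:

```shell
# hand-written .netrc: machine, login, password keywords
nf=$(mktemp)
cat > "$nf" <<'EOF'
machine example.com
login daniel
password qwerty
EOF
# a real call would look like: curl --netrc-file "$nf" <url>
grep -c '^machine' "$nf"
```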
| --ftp-account [acctname]
Response to the server's request for "account data". Sent using ACCT.
| Security for login, commands and transfer
| -k --insecure (SSL) Explicitly allow "insecure" SSL connections and transfers.
SSL connections that can't be verified with the CA certificate bundle fail unless --insecure is used.
See http://curl.haxx.se/docs/sslcerts.html
| --ftp-ssl-ccc authenticate with SSL then
CCC (Clear Command Channel)
Allows NAT routers to follow the transaction.
| --ftp-ssl-control SSL/TLS for the login, clear for transfer.
| --key keyfile (SSL/SSH) Private key file name.
| --key-type type Private key file type: the type of the private key --key provides. DER, PEM and ENG. Default PEM.
| --[no-]sessionid (SSL) Disable use of SSL session-ID caching.
Use --sessionid to enforce session-ID caching.
--pass phrase (SSL/SSH) Passphrase for the private key.
| --pubkey file (SSH) Public key file.