Detailed explanation of Linux curl command
Command: curl
In Linux, curl is a command-line tool for transferring files using URL syntax. It is a very powerful HTTP command-line tool that supports both file uploading and downloading, making it a general-purpose transfer tool, although by convention it is usually referred to simply as a download tool.
Syntax: # curl [option] [url]
Common parameters:
-A/--user-agent <string>    Set the User-Agent string sent to the server
-b/--cookie <name=string/file>    Cookie string, or file to read cookies from
-c/--cookie-jar <file>    Write cookies to this file after the operation
-C/--continue-at <offset>    Resume the transfer at the given offset
-D/--dump-header <file>    Write the received header information to this file
-e/--referer    Source URL (Referer)
-f/--fail    Fail silently on HTTP errors (do not output the error page)
-o/--output    Write output to this file
-O/--remote-name    Write output to a local file named like the remote file
-r/--range <range>    Retrieve only the given byte range from an HTTP/1.1 or FTP server
-s/--silent    Silent mode; do not output anything
-T/--upload-file <file>    Upload a file
-u/--user <user[:password]>    Set the server user and password
-w/--write-out [format]    What to output after completion
-x/--proxy <host[:port]>    Use an HTTP proxy on the given port
-#/--progress-bar    Show a progress bar for the current transfer
example:
1. Basic usage
# curl http://www.linux.com
After execution, the HTML of www.linux.com is displayed on the screen
Ps: Since Linux is often installed without a desktop environment, which also means there is no browser, this method is commonly used to test whether a server can reach a given website
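Because -I (--head, listed under "Other parameters" below) asks the server for the document headers only, one quick way to check reachability without downloading the whole page could be:
# curl -I http://www.linux.com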
2. Save the visited web pages
2.1: Save using Linux's redirection feature
# curl http://www.linux.com >> linux.html
2.2: You can use curl's built-in option: -o (lowercase) to save web pages
$ curl -o linux.html http://www.linux.com
After the execution completes, output like the following is displayed. When it reaches 100%, the save was successful.
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 79684    0 79684    0     0  3437k      0 --:--:-- --:--:-- --:--:-- 7781k
2.3: You can use curl's built-in option: -O (uppercase) to save files from web pages
Note that the URL here must point to a specific file, otherwise nothing will be downloaded.
# curl -O http://www.linux.com/hello.sh
3. Test the return value of the webpage
# curl -o /dev/null -s -w %{http_code} www.linux.com
Ps: In scripts, this is a very common way to test whether a website is up; a minimal sketch follows.
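A minimal script sketch of this pattern, assuming an HTTP 200 is what counts as "normal" (the URL is just the placeholder used throughout this article):
#!/bin/bash
# Fetch only the status code; discard the body (-o /dev/null) and progress output (-s)
status=$(curl -o /dev/null -s -w "%{http_code}" http://www.linux.com)
if [ "$status" -eq 200 ]; then
    echo "site is up (HTTP $status)"
else
    echo "site check failed (HTTP $status)"
fi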
4. Specify the proxy server and its port
It is often necessary to go through a proxy server to get online (for example, when your network requires a proxy, or when a site has blocked your IP address because of your curl requests). Fortunately, curl supports setting a proxy with the built-in option: -x
# curl -x 192.168.100.100:1080 http://www.linux.com
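If the proxy itself requires a login, the -U/--proxy-user option from the parameter lists in this article can be combined with -x; a sketch with made-up credentials:
# curl -U proxyuser:proxypasswd -x 192.168.100.100:1080 http://www.linux.com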
5. Cookies
Some websites use cookies to record session information. A browser such as Chrome handles cookie information easily, and curl can do the same by adding the relevant parameters
5.1: Save the cookie information from the http response. Built-in option: -c (lowercase)
# curl -c cookiec.txt http://www.linux.com
After execution, the cookie information is stored in cookiec.txt
5.2: Save the header information in the http response. Built-in option: -D
# curl -D cookied.txt http://www.linux.com
After execution, the header information (including the cookies) is stored in cookied.txt
Note: The cookie file produced by -c (lowercase) is not the same as the output of -D: -c writes a dedicated cookie-jar file, while -D saves the raw response headers (which include the Set-Cookie lines).
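To look at the raw Set-Cookie headers that -D captures, one possible check is to dump the headers to the screen with "-D -" and filter them:
# curl -s -D - -o /dev/null http://www.linux.com | grep -i Set-Cookie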
5.3: Using cookies
Many websites check your cookie information to decide whether you are visiting them in accordance with their rules, so we need to send the saved cookie information. Built-in option: -b
# curl -b cookiec.txt http://www.linux.com
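Putting 5.1 and 5.3 together, a typical session sketch first saves the cookies the site hands out and then replays them on the next request (the /login and /member paths here are hypothetical):
# curl -c cookiec.txt http://www.linux.com/login
# curl -b cookiec.txt http://www.linux.com/member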
6. Mimic the browser
Some sites require a specific browser, or even a specific browser version, before they allow access. curl's built-in option: -A lets us specify which browser to present when visiting the website
# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.linux.com
This way, the server will think it is being accessed by IE 8.0
7. Fake referer (hotlink)
Many servers check the Referer header of an HTTP request to control access. For example, if you first visit the home page and then open the mailbox page from it, the Referer for the mailbox request is the address of the home page you just visited; hotlinking works by faking this header.
curl's built-in option: -e allows us to set the referer
# curl -e "www.linux.com" http://mail.linux.com
This will make the server think you are clicking on a link from www.linux.com
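The options combine freely, so a request that fakes both the browser and the referer at the same time might look like this:
# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" -e "www.linux.com" http://mail.linux.com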
8. Download files
8.1: Use curl to download files.
#Use built-in option: -o (lowercase)
# curl -o dodo1.jpg http://www.linux.com/dodo1.JPG
#Use built-in option: -O (uppercase)
# curl -O http://www.linux.com/dodo1.JPG
This will save the file locally with the name on the server
8.2: Cyclic download
Sometimes the files to download share the same beginning of the name and differ only in the trailing part
# curl -O http://www.linux.com/dodo[1-5].JPG
This will save all of dodo1, dodo2, dodo3, dodo4 and dodo5
8.3: Download and rename
# curl -O http://www.linux.com/{hello,bb}/dodo[1-5].JPG
Because the files downloaded from both hello and bb are named dodo1 through dodo5, the second set would overwrite the first, so the files need to be renamed.
# curl -o #1_#2.JPG http://www.linux.com/{hello,bb}/dodo[1-5].JPG
This way, the downloaded hello/dodo1.JPG becomes hello_dodo1.JPG locally, and so on for the other files, which effectively prevents the files from overwriting each other
8.4: Chunked download
Sometimes the file to download is fairly large, and in that case we can download it in segments. Use the built-in option: -r
# curl -r 0-100 -o dodo1_part1.JPG http://www.linux.com/dodo1.JPG
# curl -r 101-200 -o dodo1_part2.JPG http://www.linux.com/dodo1.JPG
# curl -r 201- -o dodo1_part3.JPG http://www.linux.com/dodo1.JPG
# cat dodo1_part* > dodo1.JPG
This way you can then view the complete content of dodo1.JPG
8.5: Download files via ftp
curl can download files over FTP, and it provides two syntaxes for doing so
# curl -O -u username:password ftp://www.linux.com/dodo1.JPG
# curl -O ftp://username:password@www.linux.com/dodo1.JPG
8.6: Show download progress bar
# curl -# -O http://www.linux.com/dodo1.JPG
8.7: Do not display download progress information
# curl -s -O http://www.linux.com/dodo1.JPG
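Because -s also hides real errors, scripts often pair it with -S (--show-error, listed in the parameter list below) so that failures are still reported:
# curl -sS -O http://www.linux.com/dodo1.JPG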
9. Resume interrupted downloads
On Windows, we can use software such as Thunder to resume interrupted downloads. curl can achieve the same effect with the built-in option: -C
If the connection suddenly drops while downloading dodo1.JPG, you can resume the download as follows
# curl -C - -O http://www.linux.com/dodo1.JPG
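Here "-C -" tells curl to work out the resume offset from the existing local file on its own; -C can also be given an explicit byte offset. On a flaky network it is common to combine it with --retry (described in the parameter list below), for example:
# curl -C - --retry 3 -O http://www.linux.com/dodo1.JPG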
10. Upload files
curl can not only download files but also upload them. This is done with the built-in option: -T
# curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/
This uploads the file dodo1.JPG to the ftp server
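For HTTP rather than FTP, the -F option from the parameter list below emulates a form upload; the upload.php endpoint and the "file" field name here are purely illustrative:
# curl -F "file=@dodo1.JPG" http://www.linux.com/upload.php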
11. Display fetch errors
# curl -f http://www.linux.com/error
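With -f, curl exits with a non-zero status on an HTTP error instead of printing the server's error page, which makes the result easy to branch on in a script; a small sketch using the placeholder URL above:
# curl -f -s -o /dev/null http://www.linux.com/error || echo "fetch failed"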
Other parameters (reproduced here in translation):
-a/--append    When uploading, append to the target file
--anyauth    Let curl pick any suitable authentication method
--basic    Use HTTP Basic authentication
-B/--use-ascii    Use ASCII/text transfer
-d/--data <data>    Send data with an HTTP POST
--data-ascii <data>    POST data as ASCII
--data-binary <data>    POST data as binary
--negotiate    Use HTTP Negotiate authentication
--digest    Use HTTP Digest authentication
--disable-eprt    Disable use of EPRT or LPRT
--disable-epsv    Disable use of EPSV
--egd-file <file>    EGD socket path for random data (SSL)
--tcp-nodelay    Use the TCP_NODELAY option
-E/--cert <cert[:passwd]>    Client certificate file and password (SSL)
--cert-type <type>    Certificate file type (DER/PEM/ENG) (SSL)
--key <key>    Private key file name (SSL)
--key-type <type>    Private key file type (DER/PEM/ENG) (SSL)
--pass <pass>    Private key passphrase (SSL)
--engine <eng>    Crypto engine to use (SSL); "--engine list" shows the available engines
--cacert <file>    CA certificate file (SSL)
--capath <directory>    CA directory (made using c_rehash) to verify the peer against (SSL)
--ciphers <list>    SSL ciphers to use
--compressed    Request a compressed response (deflate or gzip)
--connect-timeout <seconds>    Maximum time allowed for the connection
--create-dirs    Create the necessary local directory hierarchy
--crlf    Convert LF to CRLF in uploads
--ftp-create-dirs    Create remote directories if they do not exist
--ftp-method [multicwd/nocwd/singlecwd]    Control CWD usage
--ftp-pasv    Use PASV/EPSV instead of PORT
--ftp-skip-pasv-ip    When using PASV, ignore the IP address returned by the server
--ftp-ssl    Try SSL/TLS for the FTP transfer
--ftp-ssl-reqd    Require SSL/TLS for the FTP transfer
-F/--form <name=content>    Emulate an HTTP form submission
--form-string <name=string>    Emulate an HTTP form submission (literal string value)
-g/--globoff    Disable URL sequences and ranges using {} and []
-G/--get    Send data with GET instead of POST
-h/--help    Help
-H/--header <line>    Custom header information to pass to the server
--ignore-content-length    Ignore the HTTP Content-Length header
-i/--include    Include protocol header information in the output
-I/--head    Show document (header) information only
-j/--junk-session-cookies    Ignore session cookies when reading a cookie file
--interface <interface>    Use the specified network interface/address
--krb4 <level>    Use krb4 with the specified security level
-k/--insecure    Allow connections to SSL sites without certificate verification
-K/--config    Read configuration from the specified file
-l/--list-only    List only the names of files in an FTP directory
--limit-rate <rate>    Limit the transfer speed
--local-port <NUM>    Force use of the given local port number
-m/--max-time <seconds>    Maximum time allowed for the transfer
--max-redirs <num>    Maximum number of redirects to follow
--max-filesize <bytes>    Maximum size of a file to download
-M/--manual    Show the full manual
-n/--netrc    Read the username and password from the .netrc file
--netrc-optional    Use .netrc or the URL; overrides -n
--ntlm    Use HTTP NTLM authentication
-N/--no-buffer    Disable buffering of the output stream
-p/--proxytunnel    Tunnel through the HTTP proxy
--proxy-anyauth    Pick any suitable proxy authentication method
--proxy-basic    Use Basic authentication on the proxy
--proxy-digest    Use Digest authentication on the proxy
--proxy-ntlm    Use NTLM authentication on the proxy
-P/--ftp-port <address>    Use PORT with the given address instead of PASV
-Q/--quote <cmd>    Send a command to the server before the file transfer
--random-file <file>    File to read random data from (SSL)
-R/--remote-time    Preserve the remote file's timestamp on the local file
--retry <num>    Number of retries when the transfer has problems
--retry-delay <seconds>    Delay between retries when the transfer has problems
--retry-max-time <seconds>    Maximum total time for retries when the transfer has problems
-S/--show-error    Show errors
--socks4 <host[:port]>    Use a SOCKS4 proxy on the given host and port
--socks5 <host[:port]>    Use a SOCKS5 proxy on the given host and port
-t/--telnet-option <OPT=val>    Set a Telnet option
--trace <file>    Write a debug trace to the specified file
--trace-ascii <file>    Like --trace but without hex output
--trace-time    Add timestamps to trace/verbose output
--url <URL>    Specify a URL to work with
-U/--proxy-user <user[:password]>    Set the proxy username and password
-V/--version    Show version information
-X/--request <command>    Specify the request method (command) to use
-y/--speed-time    Time, in seconds, the speed must stay below the limit before giving up; default is 30
-Y/--speed-limit    Stop the transfer if slower than this speed for speed-time seconds
-z/--time-cond    Transfer based on a time condition
-0/--http1.0    Use HTTP 1.0
-1/--tlsv1    Use TLSv1 (SSL)
-2/--sslv2    Use SSLv2 (SSL)
-3/--sslv3    Use SSLv3 (SSL)
--3p-quote    Like -Q, but for the source URL in a third-party transfer
--3p-url    Source URL for a third-party transfer
--3p-user    Username and password for the third-party transfer source
-4/--ipv4    Use IPv4
-6/--ipv6    Use IPv6