  Linux tool curl and wget advanced use
     
  Add Date : 2018-11-21      
         
         
         
1. curl (file transfer tool)

Common parameters:
-c, --cookie-jar <file>: write cookies to the file after the transfer
-b, --cookie <file>: read cookies from the file
-C, --continue-at <offset>: resume a previous transfer at the given offset
-d, --data <data>: send data in an HTTP POST request
-D, --dump-header <file>: write the received headers to the file
-F, --form <name=content>: submit data as if from an HTML form (multipart/form-data)
-s, --silent: silent mode, reduce output
-o, --output <file>: write output to the file
-O, --remote-name: save locally using the remote file's name
-I, --head: fetch the headers only
-u, --user <user:pass>: set the user and password for HTTP authentication
-T, --upload-file <file>: upload the file
-e, --referer <URL>: set the referer address
-x, --proxy <host:port>: use the specified proxy server and port
-w, --write-out <format>: display information in the given format after completion
--retry <num>: number of retries
--connect-timeout <sec>: maximum time allowed for the connection, in seconds

Usage examples:
Example 1: fetch a page to a specified file; if the output is garbled, transcode it with iconv
# curl -o baidu.html www.baidu.com
# curl -s www.baidu.com | iconv -f utf-8 > baidu.html # -s reduces output
Example 2: fake the browser header (user-agent)
# curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" www.baidu.com
Example 3: handle redirects
# curl -L http://192.168.1.100/301.php # by default curl does not follow redirects
Example 4: simulate a user login, save the cookies to cookies.txt, then log in with those cookies
# curl -c ./cookies.txt -F NAME=user -F PWD=*** URL # the form fields NAME and PWD differ from site to site
# curl -b ./cookies.txt URL
Example 5: get the HTTP response headers
# curl -I http://www.baidu.com
# curl -D ./header.txt http://www.baidu.com # save the headers to a file
Example 6: access a page protected by HTTP authentication
# curl -u user:pass URL
Example 7: upload and download files over FTP
# curl -T filename ftp://user:pass@ip/docs # upload
# curl -O ftp://user:pass@ip/filename # download
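Example 8 below is a sketch of how the -w, --retry, and --connect-timeout parameters listed earlier fit together (the %{http_code} and %{time_total} variables are standard curl --write-out variables; the URL is just reused from the examples above):
Example 8: print the HTTP status code and total transfer time, retrying up to 3 times with a 5-second connection timeout
# curl -s -o /dev/null -w "%{http_code} %{time_total}\n" --retry 3 --connect-timeout 5 http://www.baidu.com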

2. wget (download tool)

Common parameters:
2.1 Startup parameters
-V, --version: display the version number
-h, --help: show help
-b, --background: go to the background after startup
2.2 Logging and input-file parameters
-o, --output-file=file: write log messages to the file
-a, --append-output=file: append log messages to the file
-i, --input-file=file: read the URLs to download from the file
2.3 Download parameters
--bind-address=address: bind to the specified local address
-t, --tries=number: set the maximum number of connection attempts
-c, --continue: resume a partially downloaded file
-O, --output-document=file: write the download to the file
--spider: do not download anything (just check that the URL exists)
-T, --timeout=sec: set the response timeout
-w, --wait=sec: wait the given number of seconds between two attempts
--limit-rate=rate: limit the download speed
--progress=type: set the progress bar style
2.4 Directory parameters
-P, --directory-prefix=prefix: save files to the specified directory
2.5 HTTP parameters
--http-user=user: set the HTTP user name
--http-password=pass: set the HTTP password
-U, --user-agent=agent: disguise the user agent
--no-http-keep-alive: disable HTTP keep-alive (persistent connections)
--no-cookies: do not use cookies
--load-cookies=file: load cookies from the file before the session starts
--save-cookies=file: save cookies to the file when the session ends
2.6 FTP parameters
--passive-ftp: use passive mode (the default)
--no-passive-ftp: use active mode
2.7 Recursive accept/reject parameters
-A, --accept=list: comma-separated list of extensions to download
-R, --reject=list: comma-separated list of extensions not to download
-D, --domains=list: comma-separated list of domains to download from
--exclude-domains=list: comma-separated list of domains not to download from

Usage examples:
Example 1: download a single file to the current directory; you can also specify the target directory with -P
# wget http://nginx.org/download/nginx-1.8.0.tar.gz
Example 2: on an unstable network, use the -c and --tries parameters to make sure the download completes
# wget --tries=20 -c http://nginx.org/download/nginx-1.8.0.tar.gz
Example 3: when downloading a large file, download in the background; wget then writes the progress to a wget-log file
# wget -b http://nginx.org/download/nginx-1.8.0.tar.gz
Example 4: use the --spider parameter to check whether a URL is valid
# wget --spider http://nginx.org/download/nginx-1.8.0.tar.gz
Example 5: download files from multiple URLs automatically
# cat url_list.txt # create a URL list file
http://nginx.org/download/nginx-1.8.0.tar.gz
http://nginx.org/download/nginx-1.6.3.tar.gz
# wget -i url_list.txt
Example 6: limit the download speed
# wget --limit-rate=1m http://nginx.org/download/nginx-1.8.0.tar.gz
Example 7: log in to FTP and download a file
# wget --ftp-user=user --ftp-password=pass ftp://ip/filename
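The recursive accept/reject parameters in section 2.7 and the cookie parameters in section 2.5 have no examples above; the following are sketches (the directory URL is reused from the examples, and the NAME/PWD form fields are placeholders that differ from site to site, as in curl Example 4):
Example 8: recursively download only .tar.gz files from a directory listing, rejecting everything else
# wget -r -A "*.tar.gz" http://nginx.org/download/
Example 9: save cookies from a login POST, then reuse them in a later session
# wget --save-cookies=cookies.txt --post-data="NAME=user&PWD=***" URL
# wget --load-cookies=cookies.txt URL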
     
         
         
         
Copyright 2002-2022 newfreesoft.com, All Rights Reserved.