This article introduces wget on CentOS, covering background on the wget package along with practical usage. wget is a command-line download tool.
Linux users rely on it almost every day.
Below are several useful wget tips that can make your use of wget more efficient and flexible.
CentOS wget tips
$ wget -r -np -nd http://example.com/packages/
This command downloads every file in the packages directory on http://example.com. The -np option stops wget from traversing up into the parent directory, and -nd tells it not to recreate the remote directory structure locally.
$ wget -r -np -nd --accept=iso http://example.com/centos-5/i386/
Similar to the previous command, but with the extra --accept=iso option, which tells wget to download only the files in the i386 directory whose extension is iso. You can also specify several extensions, simply separated by commas.
$ wget -i filename.txt
This command is common for bulk downloads: put the address of every file you need into filename.txt, and wget will download them all automatically.
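As a minimal sketch of that batch workflow (the URLs and file names below are hypothetical, not from the article):

```shell
# Build a download list, one URL per line (example URLs only).
cat > filelist.txt <<'EOF'
http://example.com/packages/foo-1.0.tar.gz
http://example.com/packages/bar-2.1.tar.gz
EOF

# Hand the whole list to wget; it fetches each URL in turn.
# Commented out here so the sketch does not touch the network:
# wget -i filelist.txt
```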
$ wget -c http://example.com/really-big-file.iso
The -c option tells wget to continue an interrupted download from where it stopped instead of starting over, which matters for large files.
$ wget -m -k (-H) http://www.example.com/
This command mirrors a site; wget converts the links so they work locally. If images or other resources live on another site, add the -H option so wget will span hosts.
CentOS wget Guide
wget is a free tool for automatically downloading files from the Internet. It supports the HTTP, HTTPS, and FTP protocols and can work through an HTTP proxy. "Automatic" means that wget can keep running in the background after the user logs out: you can log in, start a download task, and then exit the system, and wget will keep working until the task is finished. Compared with most browsers, which require the user to stay involved while large amounts of data are downloaded, this saves a great deal of trouble.
wget can follow the links on HTML pages in sequence and download them to create a local copy of a remote site, completely rebuilding the original site's directory structure. This is often referred to as "recursive downloading". When downloading recursively, wget honors the Robot Exclusion Standard (/robots.txt). While downloading, wget can also convert the links to point to the local files, making offline browsing easy.
wget is very stable and adapts well to unstable networks and very narrow bandwidth. If a download fails because of a network problem, wget keeps retrying until the entire file has been retrieved. If the server interrupts the download, wget reconnects and resumes from where it left off. This is very useful when downloading large files from servers that limit connection time.
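The retry-and-resume behaviour described above comes down to two options, -t (tries) and -c (continue). A hedged sketch, with a placeholder URL:

```shell
# -t 0  : retry forever on network failure
# -c    : resume from the byte where the last attempt stopped
# -w 10 : wait 10 seconds between retries to be polite to the server
URL="http://example.com/really-big-file.iso"   # placeholder URL
CMD="wget -t 0 -c -w 10 $URL"

# Commented out so the sketch stays offline; uncomment to run:
# $CMD
echo "$CMD"
```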
Common wget usage
wget invocation format:
Usage: wget [OPTION] ... [URL] ...
Mirror a site with wget:
wget -r -p -np -k http://dsec.pku.edu.cn/~usr_name/
# or
wget -m http://www.tldp.org/LDP/abs/html/
Resume a partially downloaded file on an unstable network, or download during idle periods:
wget -t 0 -w 31 -c http://dsec.pku.edu.cn/BBC.avi -o down.log &
# or read the list of files to download from filelist:
wget -t 0 -w 31 -c -B ftp://dsec.pku.edu.cn/linuxsoft -i filelist.txt -o down.log &
The commands above are also useful for downloading while the network is idle. My own usage: when it is inconvenient to download in Mozilla at the moment, I copy the URL and paste it into the file filelist.txt, then run the command above in the evening before leaving the system.
Download through a proxy:
wget -Y on -p -k https://sourceforge.net/projects/wvware/
# The proxy can be set in an environment variable:
export PROXY=http://211.90.168.94:8080/
# or in the wgetrc file; set the proxy in ~/.wgetrc:
http_proxy = http://proxy.yoyodyne.com:18023/
ftp_proxy = http://proxy.yoyodyne.com:18023/
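To avoid clobbering a real ~/.wgetrc, this sketch writes the two proxy lines shown above to a scratch file; wget can be pointed at it via the WGETRC environment variable (the proxy host is the example from the text, the scratch file name is made up):

```shell
# Write proxy settings to a scratch wgetrc instead of ~/.wgetrc.
WGETRC_FILE="./wgetrc.example"
cat > "$WGETRC_FILE" <<'EOF'
http_proxy = http://proxy.yoyodyne.com:18023/
ftp_proxy = http://proxy.yoyodyne.com:18023/
EOF

# wget reads this file when WGETRC points at it (offline sketch):
# WGETRC="$WGETRC_FILE" wget -p -k https://sourceforge.net/projects/wvware/
```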
wget options by category
Startup:
-V, --version display the version of wget and exit
-h, --help print this help
-b, --background go to background after startup
-e, --execute=COMMAND execute a `.wgetrc'-style command; for the wgetrc format see /etc/wgetrc or ~/.wgetrc
Logging and input file:
-o, --output-file=FILE log messages to FILE
-a, --append-output=FILE append log messages to FILE
-d, --debug Print debugging output
-q, --quiet Quiet mode (no output)
-v, --verbose verbose mode (this is the default setting)
-nv, --non-verbose turn off verbose mode, but not silent
-i, --input-file=FILE download the URLs found in FILE
-F, --force-html treat the input file as HTML
-B, --base=URL prepend URL to relative links in the file given by -F -i
--sslcertfile = FILE optional client certificate
--sslcertkey=KEYFILE optional client certificate key KEYFILE
--egd-file = FILE specify the EGD socket file name
--bind-address=ADDRESS bind to ADDRESS (hostname or IP) on the local host, useful when the local machine has several IPs or names
Download:
-t, --tries=NUMBER set the maximum number of retries (0 for unlimited)
-O, --output-document=FILE write documents to FILE
-nc, --no-clobber do not overwrite existing files
-c, --continue resume getting a partially downloaded file
--progress = TYPE set the progress bar marker
-N, --timestamping Do not download the file again unless newer than local file
-S, --server-response print the server response
--spider don't download anything
-T, --timeout=SECONDS set the read timeout to SECONDS
-w, --wait=SECONDS wait SECONDS between two attempts
--waitretry = SECONDS between relink wait 1 ... SECONDS seconds
--random-wait wait between downloads 0 ... 2 * WAIT seconds
-Y, --proxy=on/off turn the proxy on or off
-Q, --quota=NUMBER set the download quota to NUMBER
--limit-rate=RATE limit the download rate to RATE
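Several of the download options above combine naturally. A hypothetical throttled, resumable fetch might look like this (the URL and the particular limits are placeholders, not from the article):

```shell
# --limit-rate caps bandwidth, -Q caps total downloaded bytes,
# -c resumes interrupted transfers, -w pauses between retrievals.
CMD="wget -c --limit-rate=200k -Q 500m -w 5 http://example.com/big.iso"

# Offline sketch; uncomment to actually run:
# $CMD
echo "$CMD"
```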
Directories:
-nd, --no-directories do not create directories
-x, --force-directories force creation of directories
-nH, --no-host-directories do not create host directories
-P, --directory-prefix=PREFIX save files to the directory PREFIX/...
--cut-dirs=NUMBER ignore NUMBER remote directory components
HTTP options:
--http-user=USER set the HTTP user name to USER
--http-passwd=PASS set the HTTP password to PASS
-C, --cache=on/off enable/disable server-side data caching (normally allowed)
-E, --html-extension save all text/html documents with the .html extension
--ignore-length ignore `Content-Length 'header field
--header = STRING insert STRING string in the headers
--proxy-user = USER set proxy user name USER
--proxy-passwd = PASS set PASS as proxy password
--referer=URL include the `Referer: URL' header in the HTTP request
-s, --save-headers save the HTTP headers to file
-U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION
--no-http-keep-alive disable HTTP keep-alive (persistent connections)
--load-cookies=FILE load cookies from FILE before the session starts
--save-cookies=FILE save cookies to FILE after the session ends
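A sketch of a cookie-backed session using the HTTP options above: log in once, save the cookies, then reuse them for later requests. The site, paths, and cookie-jar name are hypothetical:

```shell
COOKIES="cookies.txt"   # hypothetical cookie jar file

# First request saves the session cookies (offline sketch, commented out):
# wget --save-cookies "$COOKIES" --keep-session-cookies http://example.com/login
# Later requests present them, plus a custom header and user agent:
# wget --load-cookies "$COOKIES" \
#      --header "Accept-Language: en" \
#      -U "Mozilla/5.0" \
#      http://example.com/members/file.zip
echo "cookie jar: $COOKIES"
```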
FTP options:
-nr, --dont-remove-listing do not remove `.listing' files
-g, --glob=on/off turn file-name globbing on or off
--passive-ftp use passive transfer mode (the default value).
--active-ftp use active transfer mode
--retr-symlinks when recursing, retrieve the files that symlinks point to (not directories)
Recursive download:
-r, --recursive recursive download, use with care!
-l, --level = NUMBER maximum recursion depth (inf or 0 for infinite).
--delete-after delete files locally after they have been downloaded
-k, --convert-links convert non-relative links to relative ones
-K, --backup-converted before converting file X, back it up as X.orig
-m, --mirror equivalent to -r -N -l inf -nr
-p, --page-requisites download all images needed to display the HTML page
Recursive accept/reject:
-A, --accept=LIST comma-separated list of accepted extensions
-R, --reject=LIST comma-separated list of rejected extensions
-D, --domains=LIST comma-separated list of accepted domains
--exclude-domains=LIST comma-separated list of rejected domains
--follow-ftp track FTP links in HTML documents
--follow-tags=LIST comma-separated list of HTML tags to follow
-G, --ignore-tags=LIST comma-separated list of HTML tags to ignore
-H, --span-hosts go to foreign hosts when recursing
-L, --relative follow relative links only
-I, --include-directories=LIST list of allowed directories
-X, --exclude-directories=LIST list of excluded directories
-np, --no-parent do not ascend to the parent directory
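Putting the recursive accept/reject options together: a hypothetical mirror of one directory tree that takes only ISO images, skips one subdirectory, and never climbs to the parent might be built like this (all paths are placeholders):

```shell
# -r recursive, -np stay below the start directory, -nd flatten output,
# -A limit by extension, -X skip a subdirectory (hypothetical paths).
CMD="wget -r -np -nd -A iso -X /centos-5/i386/debug http://example.com/centos-5/i386/"

# Offline sketch; uncomment to run for real:
# $CMD
echo "$CMD"
```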
NOTE: To stop a download, press Ctrl+C.
Summary: wget is a free tool for automatically downloading files from the Internet. It supports HTTP, HTTPS, and FTP, can work through an HTTP proxy, and keeps running in the background after you log out, so you can start a large download, exit the system, and let wget finish the job without any further attention.