- slide 1 of 4
Wget first appeared in 1996 for the UNIX environment and was adopted early on as the de facto standard means of downloading files on UNIX, and later Linux. Since then wget has been ported to Windows, Mac, OpenVMS, and AmigaOS. Wget is written in portable C and has a graphical front end, GWget.
- slide 2 of 4
Download resuming: If your download is interrupted (such as from a lost connection), wget will attempt to resume the download from where it left off.
Recursive downloading: Wget can follow links to download entire directory trees and the files within them. This works well for mirroring whole websites for offline viewing.
Non-interactive downloading: Wget requires no user interaction while it runs; you can even log off your account and the download will continue.
- Proxy support
- Persistent HTTP connections
- IPv6 support
- SSL/TLS support
- Support for files larger than 2 GB
- Download speed throttling
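As a quick sketch of the non-interactive side of that list, the commands below use python3's http.server as a stand-in for a remote host (the port, directory, and file names are all invented for the demo) and let wget's -b flag carry the download on in the background:

```shell
# Sketch: a background, non-interactive download. python3's http.server
# stands in for a remote host; the port (8031) and file names are invented.
set -e
demo=$(mktemp -d)
head -c 32768 /dev/urandom > "$demo/data.bin"        # something to download
python3 -m http.server 8031 --directory "$demo" >/dev/null 2>&1 &
srv=$!
sleep 1
cd "$(mktemp -d)"
# -b detaches wget into the background and writes its log to ./wget-log,
# so the transfer keeps going even if you close the terminal.
wget -b http://127.0.0.1:8031/data.bin >/dev/null
sleep 2                                              # let the background job finish
kill "$srv"
cmp -s "$demo/data.bin" data.bin && echo "background download ok"
```

Against a real remote host you would simply run wget -b with your URL; the local server here just keeps the sketch self-contained.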
- slide 3 of 4
Wget, like all command line tools, is used from the command line. So to take advantage of wget's power you must first open a terminal window (such as aterm, eterm, xterm, or gnome-terminal). With the terminal window open, the basic usage of the command is:
wget LINK_TO_DOWNLOAD_FILE
Where LINK_TO_DOWNLOAD_FILE is the actual URL of the file you want to download. The URL doesn't have to be HTTP; wget can download from FTP sources as well.
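A minimal sketch of that basic form, again using a throwaway local python3 web server in place of a real remote host (the port, directory, and notes.txt file are invented for the demo):

```shell
# Sketch of the basic wget form. python3's http.server stands in for a
# remote host; the port (8044) and notes.txt are invented for the demo.
set -e
demo=$(mktemp -d)
echo "hello from the server" > "$demo/notes.txt"
python3 -m http.server 8044 --directory "$demo" >/dev/null 2>&1 &
srv=$!
sleep 1
cd "$(mktemp -d)"
wget -q http://127.0.0.1:8044/notes.txt   # saved locally as notes.txt
kill "$srv"
cat notes.txt
```

Note that wget names the downloaded file after the last part of the URL, so the fetched copy lands in the current directory as notes.txt.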
Say there is a particular website (such as Brighthub) you want to be able to view offline. To do this you would invoke wget like so:
wget -r -l 0 http://www.brighthub.com
In the above example you can see the following arguments:
- -r - Tells wget to download recursively.
- -l - Sets a limit on the depth of the download. Using 0 for this argument tells wget there is no limit.
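To see -r and -l 0 work end to end without hitting a live site, the sketch below serves a tiny invented two-page site locally with python3's http.server (the pages, port, and paths are all made up) and mirrors it:

```shell
# Sketch: mirroring a tiny, invented two-page site served locally by
# python3's http.server (port 8050), to show -r and -l 0 end to end.
set -e
site=$(mktemp -d)
cat > "$site/index.html" <<'EOF'
<html><body><a href="page2.html">next</a></body></html>
EOF
echo '<html><body>second page</body></html>' > "$site/page2.html"
python3 -m http.server 8050 --directory "$site" >/dev/null 2>&1 &
srv=$!
sleep 1
out=$(mktemp -d)
cd "$out"
wget -q -r -l 0 http://127.0.0.1:8050/   # follows the link to page2.html
kill "$srv"
find "$out" -name '*.html' | sort
```

Wget parses index.html, follows the link it finds, and recreates the site's layout under a directory named for the host, which is what makes offline browsing possible.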
Let's say your download was interrupted for some reason. To pick up where wget left off, you would issue the command:
wget -c SAME_LINK_THAT_WAS_PREVIOUSLY_ATTEMPTED
Where SAME_LINK_THAT_WAS_PREVIOUSLY_ATTEMPTED is the exact link that was interrupted.
- slide 4 of 4
I use wget on a daily basis to download everything from distribution ISOs to work files. It's a workhorse application that's incredibly simple to use. So instead of clicking on that download link from inside your browser, copy and paste the URL into a wget command so you can feel safe about your download.
Linux Command Line
- Linux Command Line: Introduction
- Linux Command Line: ls
- Linux Command Line: cd
- Linux Command Line: mkdir
- Linux Command Line: df
- Linux Command Line: ln
- Linux Command Line: top
- Linux Command Line: mount/umount
- Linux Command Line: Cron/Crontab
- Linux Command Line: chmod
- Linux Command Line: wget
- Linux Command Line: cat
- Linux Command Line: grep
- Linux Command Line: dd
- Linux Command Line: sudo
- Linux Command Line: startx
- Linux Command Line: adduser
- Linux Command Line: at
- Linux Command Line: aterm
- Linux Command Line: nano
- Linux Command Line: hostname