Linux Command Line: wget

written by: jlwallen•edited by: J. F. Amprimoz•updated: 7/4/2011

There are many ways to download files, but there is only one smart way to do it from the command line: wget. Wget is a non-interactive network downloader that can fetch single files, recursively download entire directories, and even follow links.

    History

    Wget first appeared in 1996 for the UNIX environment and was adopted early on as the de facto standard means of downloading files on UNIX and later Linux. Since then wget has been ported to Windows, Mac OS, OpenVMS, and AmigaOS. Wget is written in portable C and has a graphical front end, GWget.

    Features

    Download resuming: If your download is interrupted (such as by a lost connection), wget will attempt to resume the download from where it left off.

    Recursive downloading: Wget can download complete directory trees and the files within them. This feature works well when downloading entire websites for offline viewing.

    Non-interactive downloading: Wget does not require user interaction during a download, so much so that a user can log off their account and the download will continue.
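
    For example, the -b switch tells wget to detach and keep downloading in the background, writing its progress to a wget-log file in the current directory. The URL below is only a placeholder:

    wget -b http://www.example.com/large-file.iso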

    Other features:

    • Proxy support
    • Persistent HTTP connections
    • IPv6 support
    • SSL/TLS support
    • Support for files larger than 2 GB
    • Download speed throttling (see the example below)
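
    Throttling is handy when you don't want a download to saturate your connection; wget's --limit-rate switch accepts values suffixed with k or m. The URL and rate below are only illustrations:

    wget --limit-rate=200k http://www.example.com/large-file.iso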

    Basic Usage

    Like any command line tool, wget is run from a terminal. So to take advantage of wget's power, first open a terminal window (such as aterm, eterm, xterm, or gnome-terminal). With the terminal window open, the basic usage of the command is:

    wget LINK_TO_DOWNLOAD_FILE

    Where LINK_TO_DOWNLOAD_FILE is the URL of the file you want to download. The URL doesn't have to be HTTP; wget can download from FTP sources as well.
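
    For instance, to grab a file over anonymous FTP (the server and path here are only placeholders):

    wget ftp://ftp.example.com/pub/somefile.tar.gz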

    Say there is a particular website (such as Brighthub) that you want to be able to view offline. To do this you would run wget like so:

    wget -r -l 0 http://www.brighthub.com

    In the above example you can see the following arguments:

    • -r - Download recursively.
    • -l - Sets a limit on the depth of the recursion. Using 0 for this argument tells wget there is no limit.
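
    For offline viewing you will usually also want wget to rewrite links to point at the local copies and to fetch page assets such as images and stylesheets. Both switches below are standard wget options:

    wget -r -l 0 --convert-links --page-requisites http://www.brighthub.com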

    Let's say your download was interrupted for some reason. To attempt to pick it up where wget left off you would issue the command:

    wget -c SAME_LINK_THAT_WAS_PREVIOUSLY_ATTEMPTED

    Where SAME_LINK_THAT_WAS_PREVIOUSLY_ATTEMPTED is the exact link that was interrupted.
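
    For example, if a (purely hypothetical) ISO download died partway through, re-running it with -c from the same directory tells wget to continue from the partial file instead of starting over:

    wget -c http://www.example.com/distro.iso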

    Final Thoughts

    I use wget on a daily basis to download everything from distribution ISOs to work files. It's an incredibly simple workhorse of an application. So instead of clicking that download link inside your browser, copy and paste the URL into a wget command and feel confident about your download.