wget
Download files from the internet
TLDR
Download the contents of a URL to a file
Save the download under a different local filename
Continue an incomplete download
Limit the download speed
Download a page together with all its resources for offline viewing
Download recursively up to a given depth
Only download a file if it is newer than the local copy
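As a quick sketch of typical usage (the URLs and filenames here are placeholders, not real resources):

```shell
# Download a URL, naming the local file after the remote resource
wget https://example.com/archive.tar.gz

# Save the download under a different local name
wget -O backup.tar.gz https://example.com/archive.tar.gz

# Resume a partially downloaded file
wget -c https://example.com/large.iso
```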
SYNOPSIS
wget [options] [URL]...
PARAMETERS
-b, --background
Go to background after startup.
-o logfile
Log all messages to logfile.
-O file
Write documents to file.
-c, --continue
Resume getting a partially downloaded file.
-nc, --no-clobber
Don't overwrite existing files.
-q, --quiet
Quiet mode (no output).
-v, --verbose
Be verbose (default).
-P prefix
Save files to the directory prefix.
-U agent-string
Identify as agent-string to the HTTP server.
-w seconds
Wait the given number of seconds between retrievals.
-r, --recursive
Turn on recursive retrieving.
-l depth
Maximum recursion depth (inf or 0 for infinite).
-A list
Comma-separated list of accepted extensions.
-R list
Comma-separated list of rejected extensions.
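Several of these parameters are commonly combined in a single invocation. A sketch (the URL and filenames are placeholders):

```shell
# Quietly resume a download into ./downloads, waiting 2 seconds
# between requests and logging all messages to wget.log
wget -q -c -w 2 -P downloads -o wget.log https://example.com/files/data.zip
```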
DESCRIPTION
wget is a command-line utility for retrieving files using HTTP, HTTPS, and FTP protocols. It's non-interactive, meaning it can work in the background without user input, which makes it ideal for scripting.
wget is highly configurable, allowing users to specify download directories, control bandwidth usage, resume interrupted downloads, mirror websites, and much more. It's a robust tool for automating file retrieval tasks and is commonly used in server environments for downloading software packages, website backups, and other large files. wget also attempts to respect the robots.txt file of web servers.
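For example, recursive retrieval with a depth limit and a suffix filter can be combined to mirror part of a site (the URL below is a placeholder):

```shell
# Fetch pages up to 2 levels deep, keeping only HTML and PDF files
wget -r -l 2 -A html,pdf https://example.com/docs/
```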
ROBOTS.TXT
Wget respects the robots.txt file when mirroring or recursively downloading websites. The robots.txt file is a standard text file that webmasters use to instruct robots (such as web crawlers) about which parts of their site should not be processed or scanned. By default, Wget honors these directives, both to avoid overloading servers and to respect the wishes of the website owner.
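This default can be overridden through wget's -e option, which executes a .wgetrc-style command; use this responsibly and only on sites you are permitted to crawl (the URL is a placeholder):

```shell
# Recursive retrieval that ignores robots.txt directives
wget -r -e robots=off https://example.com/
```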
HISTORY
wget was initially developed by Hrvoje Nikšić in 1995. It was designed as a free alternative to proprietary download managers. Over the years, it has become a standard tool on Unix-like systems and is widely used for automating file downloads and mirroring websites. It has seen continuous development and updates to support new protocols and features, becoming a crucial component of many system administration and automation workflows.