
eget

Easily install prebuilt binaries from GitHub

TLDR

Download a prebuilt binary for the current system from a repository on GitHub

$ eget [zyedidia/micro]

Download from a URL
$ eget [https://go.dev/dl/go1.17.5.linux-amd64.tar.gz]

Specify the location to place the downloaded files
$ eget [zyedidia/micro] --to=[path/to/directory]

Specify a Git tag instead of using the latest version
$ eget [zyedidia/micro] --tag=[v2.0.10]

Install the latest pre-release instead of the latest stable version
$ eget [zyedidia/micro] --pre-release

Only download the asset, skipping extraction
$ eget [zyedidia/micro] --download-only

Only download if there is a newer release than the currently downloaded version
$ eget [zyedidia/micro] --upgrade-only
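If the project publishes checksums, the download can be verified as well; a sketch using --verify-sha256, where the checksum value is a placeholder for the project's published digest
$ eget [zyedidia/micro] --verify-sha256=[sha256_checksum]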

SYNOPSIS

eget [OPTIONS] TARGET

TARGET is either a GitHub repository, given as user/repo, or a direct URL.

PARAMETERS

-t, --tag=TAG
    Use the given tagged release instead of the latest release.

--pre-release
    Include pre-releases when fetching the latest version.

--source
    Download the source code of the target repository instead of a release asset.

--to=PATH
    Move the extracted executable to the given location.

-s, --system=OS/ARCH
    Target system to download for, e.g. linux/amd64.

-f, --file=GLOB
    Glob pattern to select files for extraction.

--all
    Extract all candidate files.

-q, --quiet
    Only print essential output.

--download-only
    Stop after downloading the asset, skipping extraction.

--upgrade-only
    Only download if the release is more recent than the currently downloaded version.

-a, --asset=STRING
    Download a specific asset whose name contains the given string.

--sha256
    Show the SHA-256 hash of the downloaded asset.

--verify-sha256=HASH
    Verify the SHA-256 hash of the downloaded asset against the given hash.

--remove
    Remove the given file from $EGET_BIN or the current directory.

-v, --version
    Show version information.

-h, --help
    Show this help message.

DESCRIPTION

eget ("easy get") is a command-line tool for downloading and extracting prebuilt binaries from GitHub releases. Given a repository such as zyedidia/micro, it finds the latest release, detects the asset matching the current operating system and architecture, downloads it, extracts the executable from the archive, and places it in the current directory, or in the directory named by the EGET_BIN environment variable if it is set. It can also download and extract files from direct URLs.
Because it needs no package manager, repository metadata, or root privileges, eget is well suited to installing tools on minimal systems, inside containers, and in CI pipelines. Options allow overriding the detected system, pinning a release tag, filtering assets by name, and verifying downloads against SHA-256 checksums.
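A sketch of the typical workflow, assuming ~/bin is on your PATH (any writable directory on the PATH works the same way):

$ mkdir -p ~/bin
$ EGET_BIN=~/bin eget zyedidia/micro
$ micro --version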

ENVIRONMENT

EGET_BIN
    If set, eget places extracted binaries in this directory instead of the current directory.

EGET_GITHUB_TOKEN, GITHUB_TOKEN
    A GitHub API token used to authenticate requests to the GitHub API, which raises the rate limit for release lookups.
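For example, to avoid API rate limits when running many downloads (as in CI), a token can be exported before invoking eget; the token value below is a placeholder:

$ export EGET_GITHUB_TOKEN=[your_github_token]
$ eget [zyedidia/micro]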

HISTORY

eget was created by Zachary Yedidia, the developer of the micro text editor, and is written in Go. It was designed to make installing prebuilt binaries from GitHub releases as easy as using a package manager, without requiring one: a single static eget binary can install other tools, and even update itself, making it a convenient bootstrap step on minimal systems and in automation workflows. It is developed at github.com/zyedidia/eget.

SEE ALSO

curl(1), wget(1)
