LinuxCommandLibrary

axel

Download files using multiple connections

TLDR

Download a URL to a file
$ axel [url]

Download and specify an output file
$ axel [url] [[-o|--output]] [path/to/file]

Download with a specific number of connections
$ axel [[-n|--num-connections]] [connections_num] [url]

Search for mirrors
$ axel [[-S|--search]] [mirrors_num] [url]

Limit download speed (bytes per second)
$ axel [[-s|--max-speed]] [speed] [url]

SYNOPSIS

axel [OPTIONS] URL [URL...]

PARAMETERS

-n num, --num-connections=num
    Specify the maximum number of connections to use (default is 4).

-o file, --output=file
    Save the downloaded file as file instead of its original filename from the URL.

-s num, --max-speed=num
    Set the maximum download speed in bytes per second.

-S num, --search=num
    Search for mirrors of the file and download from up to num servers simultaneously.

-H header, --header=header
    Add an extra HTTP header to the request (e.g., 'Cookie: foo=bar').

-U agent, --user-agent=agent
    Specify a custom User-Agent string for HTTP requests.

-N, --no-proxy
    Do not use any configured HTTP/FTP proxy.

-x proxy, --proxy=proxy
    Use a specific proxy server for the download (e.g., 'http://proxy.example.com:8080').

-q, --quiet
    Suppress all status and progress messages during download.

-v, --verbose
    Print more detailed status messages and debug information.

-k, --insecure
    Do not verify the peer certificate for HTTPS downloads. Use with caution.

-e, --no-clobber
    Do not overwrite an existing file. If the target file exists, axel will exit.

-a, --no-auto-resume
    Do not automatically attempt to resume a download if the file already exists.

-b, --no-speed-meter
    Do not display the real-time speed meter (progress bar) during download.

-T timeout, --timeout=timeout
    Set the connection timeout in seconds.

-w num, --wait=num
    Wait num seconds between retries for failed connections.

-r num, --retries=num
    Set the number of retries for failed connections (default is 5).

-c, --continue
    Explicitly resume a broken download. This is often the default behavior if a partial file already exists.

-4, --ipv4
    Use IPv4 only for connections.

-6, --ipv6
    Use IPv6 only for connections.

-j, --json
    Output progress information in JSON format, suitable for scripting and parsing.

-h, --help
    Display help information and exit.

-V, --version
    Display version information and exit.
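
As a rough sketch of how several of the options above combine on a single command line (the cookie value and user-agent string below are placeholders, not values axel requires):

$ axel -n 8 -U 'Mozilla/5.0 (X11; Linux x86_64)' -H 'Cookie: session=abc123' -o path/to/file [url]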

DESCRIPTION

axel is a lightweight command-line download accelerator that speeds up downloads by using multiple connections for a single file. It supports HTTP, HTTPS, FTP, and FTPS. A key feature of axel is its ability to resume interrupted downloads automatically, which makes it resilient to network outages and system restarts. Whereas a plain wget or curl invocation fetches a file over a single connection, axel splits the file into segments and downloads the parts concurrently. This parallel approach can substantially reduce download times, particularly when a server or network path limits the throughput of any single connection. axel is often chosen for its efficiency, minimal resource consumption, and simple command-line interface.
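
The contrast with a single-connection tool can be sketched as follows (the URL is illustrative, not a real download location):

$ wget https://example.com/large-file.iso          # one connection, sequential download
$ axel -n 8 https://example.com/large-file.iso     # same file, split across eight parallel connections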

CAVEATS

Not all web servers support multiple connections to a single file via HTTP range requests; when a server does not, axel falls back to a single connection. While axel excels at accelerating downloads, it is more specialized than general-purpose tools such as curl or wget and does not offer their breadth of features for general web interaction or complex protocol handling. An excessively high number of connections (via -n) may be treated as abusive by some servers and can lead to throttling or temporary IP blocking.
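
A conservative invocation keeps the connection count modest and enables verbose output, which makes it easier to see whether the server honored the range requests or axel fell back to a single connection (the URL is again illustrative):

$ axel -n 4 -v https://example.com/large-file.iso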

RESUME CAPABILITY

axel detects an existing partial download at the target destination and automatically resumes from the point of interruption, which makes it robust against network failures and system reboots. This auto-resume behavior can be disabled with the --no-auto-resume option if desired.
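
Resuming is simply a matter of re-running the original command against the same output path; axel keeps its progress in a state file next to the output (typically with a .st suffix) and picks up where it left off. The URL below is a placeholder:

$ axel -o big.iso https://example.com/big.iso    # interrupted part-way through
$ axel -o big.iso https://example.com/big.iso    # re-run: resumes the partial big.iso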

MULTIPLE URL HANDLING

Multiple URLs may be supplied on the command line. axel treats them as alternative sources (mirrors) of the same file and balances its connections across them, rather than downloading each URL as a separate file.
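
For example, the same file hosted on two hypothetical mirrors can be fetched with connections spread across both servers:

$ axel -n 8 https://mirror1.example.com/file.iso https://mirror2.example.com/file.iso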

JSON PROGRESS OUTPUT

A modern feature, the -j or --json option allows axel to output its real-time progress updates in a structured JSON format. This makes it significantly easier to integrate axel's download status into scripts, graphical user interfaces, or other applications for programmatic monitoring and control.

HISTORY

axel was initially developed by Wilmer van der Gaast and first released around the year 2000. As an open-source project, it has been continuously maintained and evolved by a community of developers. Over its history, axel has integrated support for newer protocols and features like HTTPS and IPv6, while consistently adhering to its core principle of providing fast, multi-connection downloads from the command line. Its simplicity and effectiveness have cemented its place as a popular utility for accelerated downloading in Linux environments.

SEE ALSO

wget(1), curl(1), aria2c(1)
