LinuxCommandLibrary

linkchecker

validates links in websites and documents

TLDR

Check website links
$ linkchecker [https://example.com]
Check local HTML file
$ linkchecker [index.html]
Check recursively up to a given depth
$ linkchecker -r [depth] [https://example.com]
Write an HTML report to a file
$ linkchecker --file-output=html/utf-8/[report.html] [url]
Check external links too
$ linkchecker --check-extern [url]
Limit recursion depth (long form of -r)
$ linkchecker --recursion-level=[3] [url]

SYNOPSIS

linkchecker [options] [file-or-url]...

DESCRIPTION

linkchecker validates links in websites and documents, reporting broken links, redirects, and other errors.
It supports HTTP, HTTPS, FTP, and local files, can check sites recursively, and generates reports in several formats.
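
Several files or URLs can be checked in a single run; for example (the paths here are placeholders):
$ linkchecker [docs/index.html] [https://example.com]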

PARAMETERS

URL
URL or file to check; multiple arguments may be given.
-r NUMBER, --recursion-level=NUMBER
Check links recursively up to the given depth. A negative depth enables infinite recursion, which is also the default.
--check-extern
Check external links as well.
-o TYPE[/ENCODING], --output=TYPE[/ENCODING]
Console output type (text, html, csv, xml, among others).
-F TYPE[/ENCODING[/FILENAME]], --file-output=TYPE[/ENCODING[/FILENAME]]
Write output of the given type to a file (linkchecker-out.TYPE by default, or FILENAME if given).
--help
Display help information.
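
These options combine; for example, this invocation (URL and filename are placeholders) checks external links up to depth 2 and writes a CSV report:
$ linkchecker --check-extern --recursion-level=[2] --file-output=csv/utf-8/[links.csv] [https://example.com]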

CAVEATS

LinkChecker can be slow on large sites and may trigger server-side rate limiting, since it checks many URLs in parallel. It respects robots.txt by default.
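
If a server rate-limits the scan, reducing the number of parallel connections with the --threads option can help (the value here is illustrative):
$ linkchecker --threads=[1] [url]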

HISTORY

LinkChecker was created by Bastian Kleineidam as a comprehensive link validation tool for web content.

SEE ALSO

wget(1), curl(1), lychee(1)
