linkchecker
Validates links in websites and documents.
TLDR
Check website links
$ linkchecker [https://example.com]
Check local HTML file
$ linkchecker [index.html]
Recursive check
$ linkchecker -r [https://example.com]
Output the report to an HTML file
$ linkchecker -o html -F html/[report.html] [url]
Check external links too
$ linkchecker --check-extern [url]
Limit recursion depth
$ linkchecker -r --depth=[3] [url]
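A report written with `-o csv -F` can be post-processed with standard tools. A minimal sketch, assuming a simplified two-column layout (a real linkchecker CSV report has more columns; inspect a generated file for the actual layout):

```shell
# Hypothetical, simplified CSV report for illustration only.
printf '%s\n' \
  'urlname;result' \
  'https://example.com/ok;200 OK' \
  'https://example.com/missing;404 Not Found' > report.csv

# Print only URLs whose result is not a 200 status.
awk -F';' 'NR > 1 && $2 !~ /200/ { print $1 }' report.csv
# → https://example.com/missing
```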
SYNOPSIS
linkchecker [options] url
DESCRIPTION
linkchecker validates links in websites and documents, reporting broken links, redirects, and other errors.
The tool supports HTTP, HTTPS, FTP, and local files. It can check sites recursively and generate reports in several formats.
PARAMETERS
URL
URL or file to check.
-r
Recursive checking.
--depth N
Maximum recursion depth.
--check-extern
Check external links.
-o TYPE
Output type (text, html, csv, xml).
-F FILE
Write output to the given file.
--help
Display help information.
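The parameters above can be combined on a single command line. A sketch of a full run, built only from the documented flags; the URL and depth are placeholder values, substitute your own:

```shell
# Placeholder values, not defaults of the tool.
url="https://example.com"
depth=3

# Recursive check to a limited depth, including external links,
# with an HTML report written to a file.
echo "linkchecker -r --depth=$depth --check-extern -o html -F html/report.html $url"
# → linkchecker -r --depth=3 --check-extern -o html -F html/report.html https://example.com
```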
CAVEATS
Checking large sites can be slow and may trigger server-side rate limiting. linkchecker respects robots.txt by default.
HISTORY
LinkChecker was created by Bastian Kleineidam as a comprehensive link validation tool for web content.
