Check links in web documents or full websites.


Keywords
checking, crawling, link, site, url, validation, verification, http, link-checker, tools, web, www
Licenses
GPL-3.0/GPL-3.0+
Install
pip install LinkChecker==9.3

Documentation

LinkChecker

Check for broken links in web sites.

Features

  • recursive and multithreaded checking and site crawling
  • output in colored or normal text, HTML, SQL, CSV, XML or a sitemap graph in different formats
  • HTTP/1.1, HTTPS, FTP, mailto: and local file links support
  • restrict link checking with regular expression filters for URLs (see the example after this list)
  • proxy support
  • username/password authorization for HTTP and FTP
  • honors the robots.txt exclusion protocol
  • cookie support
  • HTML5 support
  • a command line and web interface
  • various check plugins available
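
A sketch of how the output format and URL filter features combine on the command line (the option spellings follow common LinkChecker usage; confirm them with linkchecker --help for your version):

# skip mailto: links and write the results as CSV to standard output
linkchecker --ignore-url="^mailto:" -o csv https://www.example.com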

Installation

Python 3.9 or later is required. Use pip to install LinkChecker:

pip3 install linkchecker

pipx can also be used to install LinkChecker.
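
The equivalent pipx command, assuming the same PyPI package name as above, is:

pipx install linkchecker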

The version in the pip repository may be old. To find out how to get the latest code, plus platform-specific information and other advice, see doc/install.txt in the source code archive.

Usage

Run linkchecker https://www.example.com. For other options, see linkchecker --help; for more information, see the manual pages linkchecker(1) and linkcheckerrc(5).
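
As a sketch of a deeper, more verbose check (the option names below follow common LinkChecker usage; the manual pages are authoritative):

# follow links two levels deep and print informational messages
linkchecker --verbose --recursion-level=2 https://www.example.com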

Docker usage

If you do not want to install any additional libraries or dependencies, you can use the Docker image published on GitHub Packages.

Example of an external website check:

docker run --rm -it -u $(id -u):$(id -g) ghcr.io/linkchecker/linkchecker:latest --verbose https://www.example.com

Local HTML file check:

docker run --rm -it -u $(id -u):$(id -g) -v "$PWD":/mnt ghcr.io/linkchecker/linkchecker:latest --verbose index.html

In addition to the rolling latest image, uniquely tagged images can also be found on the packages page.
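
To pin a specific build instead of the rolling latest image, substitute one of those tags for latest; the tag below is only a placeholder:

docker run --rm -it -u $(id -u):$(id -g) ghcr.io/linkchecker/linkchecker:<tag> --verbose https://www.example.com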