LinuxCommandLibrary

gau

Get All URLs: fetch known URLs from AlienVault's Open Threat Exchange, the Wayback Machine, Common Crawl, and URLScan for any given domain.

TLDR

Fetch all URLs of a domain from AlienVault's Open Threat Exchange, the Wayback Machine, Common Crawl, and URLScan

$ gau [example.com]


Fetch URLs of multiple domains
$ gau [domain1 domain2 ...]


Fetch all URLs of several domains from an input file, using multiple threads
$ gau --threads [4] < [path/to/domains.txt]


Write [o]utput results to a file
$ gau [example.com] --o [path/to/found_urls.txt]


Search for URLs from one specific provider only
$ gau --providers [wayback|commoncrawl|otx|urlscan] [example.com]


Search for URLs from multiple providers
$ gau --providers [wayback,otx,...] [example.com]


Search for URLs within a specific date range
$ gau --from [YYYYMM] --to [YYYYMM] [example.com]
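The flags above can be combined in a single invocation. A minimal sketch of a typical workflow (the domain list, flag values, and output filename are illustrative; the `gau` call is guarded with `command -v` so the script still runs where gau is not installed):

```shell
#!/bin/sh
# Build an illustrative domain list to feed gau on stdin.
printf '%s\n' example.com example.org > domains.txt

# Query only the Wayback Machine, with 4 threads, writing results to a file.
# Guarded: skipped gracefully if gau is not on PATH.
if command -v gau >/dev/null 2>&1; then
  gau --providers wayback --threads 4 --o found_urls.txt < domains.txt
fi

cat domains.txt
```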