LinuxCommandLibrary

ffuf

Fuzz web servers for resources

TLDR

Enumerate directories using [c]olored output and a [w]ordlist, specifying a target [u]RL

$ ffuf -c -w [path/to/wordlist.txt] -u [http://example.com/FUZZ]

Enumerate subdomains by moving the keyword into the hostname
$ ffuf -w [path/to/subdomains.txt] -u [http://FUZZ.example.com]

Fuzz using a specific number of [t]hreads (default: 40), pro[x]ying the traffic, and saving the [o]utput to a file
$ ffuf -o [path/to/output_file] -w [path/to/wordlist.txt] -u [http://example.com/FUZZ] -t [500] -x [http://127.0.0.1:8080]

Fuzz a specific [H]eader ("Name: Value") and [m]atch HTTP status [c]odes
$ ffuf -w [path/to/wordlist.txt] -u [http://example.com] -H "[Host: FUZZ]" -mc [200]

Fuzz with a specified HTTP method and [d]ata, while [f]iltering out comma-separated status [c]odes
$ ffuf -w [path/to/postdata.txt] -X [POST] -d "[username=admin&password=FUZZ]" -u [http://example.com/login.php] -fc [401,403]

Fuzz multiple positions with multiple wordlists using different modes
$ ffuf -w [path/to/keys:KEY] -w [path/to/values:VALUE] -mode [pitchfork|clusterbomb] -u [http://example.com/id?KEY=VALUE]

Proxy requests through an HTTP MITM pro[x]y (such as Burp Suite or mitmproxy)
$ ffuf -w [path/to/wordlist] -x [http://127.0.0.1:8080] -u [http://example.com/FUZZ]

SYNOPSIS

ffuf -u URL -w WORDLIST [options]

PARAMETERS

-u URL
    Target URL. Use the FUZZ keyword to specify injection points for wordlist entries (e.g., http://example.com/FUZZ).

-w WORDLIST[:KEYWORD]
    Path to the wordlist file. Optional :KEYWORD allows specifying a custom keyword for multiple wordlists (e.g., -w users.txt:USER -w pass.txt:PASS).

-mc CODES
    Match HTTP status codes. Comma-separated list (e.g., 200,301). Supports ranges like 200-299.

-fc CODES
    Filter HTTP status codes. Comma-separated list (e.g., 404,403).

-ms SIZE
    Match response size in bytes. Comma-separated list (e.g., 10,20).

-fs SIZE
    Filter response size in bytes. Comma-separated list (e.g., 123).

-ml LINES
    Match amount of lines in the response. Comma-separated list.

-fl LINES
    Filter by amount of lines in the response. Comma-separated list (e.g., 123).

-t THREADS
    Number of concurrent threads (default is 40). Higher values increase speed but also resource consumption.

-H HEADER
    Add a custom HTTP header (e.g., "Authorization: Bearer token"). Use FUZZ for fuzzing header values.

-X METHOD
    HTTP method to use (e.g., GET, POST, HEAD, PUT).

-d DATA
    POST request data. Use FUZZ for injecting wordlist entries into the request body.

-recursion
    Scan recursively, queueing new jobs for discovered directories. Requires the FUZZ keyword at the end of the URL; use -r separately to follow redirects.

-rate N
    Limit requests per second to N. Useful for avoiding rate limits or reducing server load.

-v
    Verbose output, printing the full URL and redirect location for each result.

-x PROXYURL
    Route requests through an HTTP or SOCKS5 proxy (e.g., http://127.0.0.1:8080 or socks5://localhost:9050). Note that -p sets a delay between requests, not a proxy.

-request REQUESTFILE
    Use an HTTP request file (e.g., captured from a proxy) as a template for requests. Requires FUZZ keyword in the file.
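    A request file is simply a raw HTTP request with the FUZZ keyword at the injection point. A hypothetical template (host and field names are placeholders) might look like:

```
POST /login.php HTTP/1.1
Host: example.com
Content-Type: application/x-www-form-urlencoded

username=admin&password=FUZZ
```

    Such a file is typically captured from an intercepting proxy, and ffuf replays it with each wordlist entry substituted for FUZZ.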

-ac
    Automatically calibrate filtering options from baseline requests. Useful for suppressing noise such as wildcard 404 pages that return status 200.

DESCRIPTION

ffuf ("Fuzz Faster U Fool") is a fast, Go-based web fuzzer designed for speed and flexibility in web application penetration testing. It excels at discovering hidden files, directories, virtual hosts, and parameters by performing highly concurrent HTTP requests against a target. Its core strength is the ability to inject fuzzed payloads into any part of an HTTP request (URL path, query parameters, headers, POST data) and then match or filter responses by status code, response size, word count, line count, or regular expression. This makes it an indispensable tool for reconnaissance, vulnerability discovery, and validating web application security. It supports multiple wordlists, custom request methods, proxies, recursion, and more, making it adaptable to complex scenarios.

CAVEATS

Ethical Use: Only use ffuf on systems for which you have explicit permission to test. Unauthorized scanning can be illegal and lead to serious consequences.

Resource Consumption: High concurrency (-t option) can lead to significant network traffic and impose a heavy load on the target server, potentially causing denial-of-service.

WAF Detection: Web Application Firewalls (WAFs) often detect and block automated scanning tools like ffuf. Techniques like rate limiting (-rate), custom headers, or specific payload encoding may be necessary to bypass them.

False Positives: Careful filtering and matching are crucial to avoid misleading results, especially when fuzzing against large wordlists or dynamic content.

COMMON USE CASES

Directory and File Enumeration: Discover hidden web paths and files (e.g., ffuf -w wordlist.txt -u http://example.com/FUZZ).

Virtual Host Discovery: Enumerate virtual hosts using the Host header (e.g., ffuf -w subdomains.txt -u http://example.com -H "Host: FUZZ.example.com").

Parameter Fuzzing: Discover hidden or undocumented parameters for endpoints (e.g., ffuf -w params.txt -u http://example.com/search?FUZZ=test).

API Endpoint Discovery: Identify and test undisclosed API paths and parameters for modern web applications.

THE FUZZ KEYWORD

ffuf uses the special keyword FUZZ to mark where wordlist entries should be injected into the target URL, HTTP headers, or POST data. The keyword is case-sensitive and must be present for ffuf to perform its fuzzing operation. Custom keywords can be defined with multiple -w flags (e.g., -w users.txt:USER -w pass.txt:PASS -u http://example.com/login -d "user=USER&pass=PASS") for combinatorial fuzzing.
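The difference between the two multi-wordlist modes can be sketched with plain shell loops (a conceptual illustration, not ffuf internals; the small arrays stand in for wordlists):

```shell
# Two small stand-in "wordlists"
keys=(id user token)
values=(1 2)

# clusterbomb mode: every KEY/VALUE combination (3 x 2 = 6 requests)
clusterbomb=()
for k in "${keys[@]}"; do
  for v in "${values[@]}"; do
    clusterbomb+=("/id?${k}=${v}")
  done
done
printf '%s\n' "${clusterbomb[@]}"

# pitchfork mode: entries paired by index, stopping at the shorter list (2 requests)
pitchfork=()
for ((i = 0; i < ${#values[@]}; i++)); do
  pitchfork+=("/id?${keys[$i]}=${values[$i]}")
done
printf '%s\n' "${pitchfork[@]}"
```

Clusterbomb grows multiplicatively with wordlist sizes, so it suits exhaustive searches (e.g., credential brute-forcing), while pitchfork walks the lists in lockstep and suits pre-paired inputs.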

RESPONSE FILTERING AND MATCHING

ffuf offers robust options for filtering out unwanted responses and matching desired ones, which is critical for accurate results. Matchers such as -mc (status codes), -ms (response size), -mw (word count), -ml (line count), and -mr (regex) narrow the output to relevant findings. The corresponding filters -fc, -fs, -fw, -fl, and -fr exclude responses that meet the given criteria (e.g., common 404 pages).
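As a rough sketch of how a comma-separated matcher spec with ranges (e.g., -mc 200-299,301) can be evaluated (assumed logic for illustration, not ffuf's actual implementation):

```shell
# Return 0 if CODE matches a -mc style spec such as "200-299,301"
matches_code() {
  local code=$1 spec=$2 part
  IFS=',' read -ra parts <<< "$spec"
  for part in "${parts[@]}"; do
    if [[ $part == *-* ]]; then
      # Range entry like 200-299: check lower and upper bounds
      (( code >= ${part%-*} && code <= ${part#*-} )) && return 0
    else
      # Exact entry like 301
      [[ $code == "$part" ]] && return 0
    fi
  done
  return 1
}

matches_code 204 "200-299,301" && echo "204: match"
matches_code 404 "200-299,301" || echo "404: no match"
```

Filters apply the same membership test but discard matching responses instead of keeping them.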

HISTORY

ffuf was created by Joona Hoikkala (@joohoi) and first released in 2018. It rapidly gained widespread adoption within the cybersecurity community due to its exceptional speed, flexibility, and intuitive command-line interface. Its development remains active, with continuous improvements and new features being added, solidifying its position as a preferred tool for web enumeration and vulnerability discovery.

SEE ALSO

gobuster(1), dirb(1), wfuzz(1), curl(1), nmap(1)
