LinuxCommandLibrary

http_load

Load test web servers with simulated traffic

TLDR

Emulate 20 requests per second for 60 seconds, based on a given URL list file
$ http_load -rate [20] -seconds [60] [path/to/urls.txt]

Emulate 5 concurrent requests for 60 seconds, based on a given URL list file
$ http_load -parallel [5] -seconds [60] [path/to/urls.txt]

Emulate 1000 requests at 20 requests per second, based on a given URL list file
$ http_load -rate [20] -fetches [1000] [path/to/urls.txt]

Emulate 1000 requests with 5 concurrent requests at a time, based on a given URL list file
$ http_load -parallel [5] -fetches [1000] [path/to/urls.txt]

SYNOPSIS

http_load [-timeout SECONDS] [-throttle] [-verbose] [-proxy host:port] [-cipher STRING] ( -parallel N | -rate N ) ( -fetches N | -seconds N ) file

PARAMETERS

-parallel N
    Number of concurrent requests to keep open at once. Exactly one of -parallel or -rate must be given.

-rate N
    Number of new requests to start per second. Exactly one of -parallel or -rate must be given.

-fetches N
    Stop after completing N total fetches. Exactly one of -fetches or -seconds must be given.

-seconds N
    Stop after N seconds have elapsed. Exactly one of -fetches or -seconds must be given.

-timeout SECONDS
    How long to wait on idle connections before giving up. Default is 60 seconds.

-throttle
    Throttle the consumption of data from the network to about 33.6 Kbps per connection, simulating access by modem users.

-verbose
    Enable verbose output.

-proxy host:port
    Use the specified HTTP proxy server.

-cipher STRING
    Negotiate SSL connections (for https:// URLs) using the specified cipher suite. Requires http_load to be built with SSL support.

file
    File containing a list of URLs, one URL per line.
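
The start and end specifiers combine with the remaining options. As an illustrative sketch (the file name urls.txt is a placeholder), the following keeps 10 connections open for 30 seconds, gives up on idle connections after 10 seconds, and prints verbose progress:

$ http_load -parallel 10 -seconds 30 -timeout 10 -verbose urls.txt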

DESCRIPTION

http_load is a program designed to run HTTP load tests. It can be used to simulate multiple users accessing a web server concurrently. It takes a list of URLs as input and fetches them repeatedly, allowing you to measure the server's performance under load, including metrics like transactions per second, average response time, and number of successful and failed requests.

This tool is useful for web developers, system administrators, and anyone who needs to assess the capacity and stability of web servers. It is simple to use and provides basic but essential performance data to help identify bottlenecks and optimize web application performance. Because of its simplicity, it issues a fixed set of concurrent HTTP requests from a single machine, which is a significant limitation compared to more advanced tools such as JMeter or LoadView.

CAVEATS

http_load is a relatively basic tool and may not accurately simulate complex real-world workloads. It runs from a single machine, which limits how much load it can generate, and it is single-threaded, so it cannot fully utilize multicore systems. It also lacks advanced features such as cookie handling, authentication, and scripting capabilities found in more sophisticated load testing tools.

URL FORMAT

The input file should contain one URL per line. Empty lines and lines starting with '#' are ignored. URLs should be fully qualified, including the protocol (http:// or https://) and hostname.
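For example, a minimal URL list file, with hostnames as placeholders, might look like:

# product pages
http://www.example.com/
http://www.example.com/products.html
https://www.example.com/login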

OUTPUT INTERPRETATION

The output provides statistics such as the number of fetches performed, the number of successful and failed requests, the total elapsed time, and the average transaction rate (fetches per second). It's important to analyze the error rate in conjunction with the transaction rate to accurately assess the server's performance under load.
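
As a rough illustration, the end-of-run summary resembles the following (all numbers invented for illustration):

1000 fetches, 5 max parallel, 8.1e+06 bytes, in 10.0 seconds
8100 mean bytes/connection
99.9 fetches/sec, 809190 bytes/sec
msecs/connect: 2.3 mean, 12.3 max, 0.9 min
msecs/first-response: 29.0 mean, 143.6 max, 15.3 min
HTTP response codes:
  code 200 -- 1000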

HISTORY

http_load was written by Jef Poskanzer of ACME Laboratories as a simple, lightweight tool for basic HTTP load testing. It has been a commonly used utility in the Unix/Linux environment for many years, serving as a foundational tool for understanding the impact of concurrent requests on web server performance before more advanced tools became prevalent.

SEE ALSO

ab(1), siege(1), wrk(1)
