LinuxCommandLibrary

kcat

Consume and produce Kafka messages

TLDR

Consume messages starting with the newest offset
$ kcat -C -t [topic] -b [brokers]

Consume messages starting with the oldest offset and exit after the last message is received
$ kcat -C -t [topic] -b [brokers] -o beginning -e

Consume messages as a Kafka consumer group
$ kcat -G [group_id] [topic] -b [brokers]

Publish message by reading from stdin
$ echo [message] | kcat -P -t [topic] -b [brokers]

Publish messages by reading from a file
$ kcat -P -t [topic] -b [brokers] [path/to/file]

List metadata for all topics and brokers
$ kcat -L -b [brokers]

List metadata for a specific topic
$ kcat -L -t [topic] -b [brokers]

Get offset for a topic/partition for a specific point in time
$ kcat -Q -t [topic]:[partition]:[unix_timestamp] -b [brokers]

SYNOPSIS

kcat [ -C | -P | -L | -Q ] [ -G group_id ] [ -b brokers ] [ -t topic ] [ options ]

PARAMETERS

-b <brokers>
    Comma-separated list of bootstrap brokers (host[:port]). Required unless bootstrap.servers is set via -X or a -F config file. kcat uses short options only; there are no long-option equivalents.

-t <topic>
    Topic to consume from, produce to, or list metadata for. In -G mode, multiple topics may be given after the group id.

-C
    Consumer mode (the default when stdin is a terminal).

-P
    Producer mode (the default when stdin is a pipe or file).

-G <group_id>
    High-level balanced consumer group mode (Kafka 0.9+); subscribes to the listed topics and commits offsets to the group.

-p <partition>
    Partition to consume from or produce to (-1 for the random partitioner when producing).

-o <offset>
    Consumer start offset: beginning, end, stored, an absolute <value>, a negative -<value> (relative to end), or s@<timestamp_ms> / e@<timestamp_ms> for a time-based start/stop.

-c <count>
    Exit after consuming or producing this many messages.

-e
    Consumer: exit successfully once the last message in all partitions has been received (EOF).

-L
    Metadata list mode: print brokers, topics, and partitions.

-D <delimiter>
    Message delimiter string (default: newline). In producer mode it splits stdin into messages; in consumer mode it is printed after each message.
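The effect of -D when producing can be pictured with standard shell tools alone, so no broker is needed; the kcat invocation in the comment uses a placeholder topic and broker:

```shell
# With -D ';', kcat would treat this stream as three separate messages:
#   printf 'one;two;three' | kcat -P -t mytopic -b localhost:9092 -D ';'
# The split it performs on stdin is equivalent to:
printf 'one;two;three' | tr ';' '\n'
# prints:
# one
# two
# three
```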

-f <format>
    Output format string: %s (payload), %k (key), %t (topic), %T (timestamp), %p (partition), %o (offset), %h (headers).

-F <file>
    Read librdkafka configuration properties from a file (same prop=value syntax as -X).

-X <prop>=<value>
    Set a librdkafka configuration property (e.g., -X sasl.mechanism=PLAIN). -X list prints the available properties.
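Repeating many -X flags gets unwieldy; -F reads the same prop=value pairs from a file. A minimal sketch of such a file for a SASL/SSL cluster, written here via a heredoc (the broker address, mechanism, and credentials are placeholder assumptions, not defaults):

```shell
# Write a librdkafka properties file, then use it with: kcat -F kcat.conf -L
# All values below are illustrative placeholders.
cat > kcat.conf <<'EOF'
bootstrap.servers=broker1:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.username=myuser
sasl.password=mypassword
EOF
```

The same settings could be given inline (e.g., -X security.protocol=SASL_SSL -X sasl.mechanism=PLAIN); run kcat -X list to see which properties your librdkafka build supports.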

-T
    Producer: act like tee, echoing each produced message to stdout.

-Q
    Query mode: look up the offset for a topic:partition at a given timestamp (see TLDR).

-s key=<serdes>, -s value=<serdes>
    Deserialize message keys/values (e.g., -s value=avro).

-r <url>
    Schema Registry URL, used together with -s avro.

-u
    Unbuffered output (flush after each message); useful when piping into other tools.

DESCRIPTION

kcat (formerly kafkacat) is a free, open-source, non-JVM command-line client for Apache Kafka, built on librdkafka. It can produce messages to topics, consume from topics with flexible offset control, list broker and topic metadata, and query partition offsets by timestamp.

Key features include high performance via the librdkafka C library, Avro deserialization with Schema Registry support, SASL/SSL authentication, message headers, and compatibility with a wide range of broker versions. It reads from stdin or files when producing and writes to stdout with customizable formatting when consuming.

Unlike Kafka's bundled console tools, kcat is lightweight, highly configurable via librdkafka properties, and well suited to scripting, testing, debugging, and benchmarking Kafka clusters. It requires no JVM, making it practical in containers and other resource-constrained environments.

CAVEATS

Requires the librdkafka library; kcat is not an official Apache Kafka tool, and its available features depend on the librdkafka version it was built against. High-throughput use may need tuned librdkafka properties (e.g., batching). Offsets are committed only in consumer group mode (-G); plain -C consumption does not commit offsets.

COMMON EXAMPLES

Consume: kcat -C -b localhost:9092 -t mytopic
Produce from stdin: echo 'hello' | kcat -b localhost:9092 -P -t mytopic
List topics: kcat -b localhost:9092 -L

FORMATS

Supports %s (payload), %k (key), %h (headers), %t (topic), %T (timestamp), %p (partition), %o (offset). Use -f '%o %p %k %s\n' for tabular output.
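To picture the tabular output without a running broker, here is a simulated line for one hypothetical message (offset 7, partition 0, key k1, payload hello are made-up values; the topic and broker in the comment are placeholders):

```shell
# The real command would be something like:
#   kcat -C -b localhost:9092 -t mytopic -e -f '%o %p %k %s\n'
# For one message with the made-up values above it would print:
printf '%s %s %s %s\n' 7 0 k1 hello
# prints: 7 0 k1 hello
```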

HISTORY

Developed by Magnus Edenhill as kafkacat around 2013, originally as a test tool for librdkafka. Renamed to kcat with its 1.7.0 release (2021) to avoid naming conflicts around the Kafka trademark. Actively maintained and widely used for Kafka operations, debugging, and CI/CD pipelines.

SEE ALSO

kafka-console-producer(1), kafka-console-consumer(1), kafka-topics(1), librdkafka(3)
