aws-firehose
Manage Amazon Data Firehose delivery streams
TLDR
List all delivery streams
aws firehose list-delivery-streams
SYNOPSIS
aws firehose subcommand [options]
DESCRIPTION
aws firehose is a subcommand of the AWS CLI that manages Amazon Data Firehose (formerly Amazon Kinesis Data Firehose), a fully managed service for loading streaming data into data stores and analytics services.
Firehose automatically batches, compresses, transforms, and encrypts data before delivering it to destinations such as S3, Redshift, OpenSearch, Splunk, or custom HTTP endpoints. It scales automatically; no capacity provisioning is required.
Data can be sent directly via the put-record APIs or ingested from a Kinesis data stream. Firehose buffers records by size (1-128 MB) or time (60-900 seconds), delivering when either threshold is reached first.
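As a minimal sketch of the direct put-record path, the snippet below base64-encodes a small JSON payload as the CLI requires; the stream name my-stream is hypothetical, and the actual API call is shown commented because it needs AWS credentials:

```shell
# The CLI requires the record's Data field to be base64-encoded.
data=$(printf '{"event":"click"}' | base64)
echo "$data"   # -> eyJldmVudCI6ImNsaWNrIn0=

# With credentials configured, the record would be sent with:
# aws firehose put-record \
#   --delivery-stream-name my-stream \
#   --record "Data=$data"
```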
PARAMETERS
list-delivery-streams
List all delivery streams in the account.
describe-delivery-stream
Get detailed configuration of a stream.
create-delivery-stream
Create a new delivery stream.
delete-delivery-stream
Remove a delivery stream.
put-record
Send a single data record.
put-record-batch
Send multiple records in one request.
update-destination
Modify destination configuration.
start-delivery-stream-encryption
Enable server-side encryption.
stop-delivery-stream-encryption
Disable encryption.
--delivery-stream-name name
Name of the delivery stream.
--record data
Single record with Data field (base64-encoded).
--records records
Array of records for a batch put.
--s3-destination-configuration config
S3 destination settings.
--redshift-destination-configuration config
Redshift destination settings.
--extended-s3-destination-configuration config
S3 destination with data transformation settings.
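A sketch of batching several records into one put-record-batch request, again with a hypothetical stream name my-stream and the credentialed call left commented:

```shell
# Each record's Data field is base64-encoded individually.
r1=$(printf 'first'  | base64)
r2=$(printf 'second' | base64)
echo "$r1 $r2"

# With credentials configured (up to 500 records per request):
# aws firehose put-record-batch \
#   --delivery-stream-name my-stream \
#   --records "Data=$r1" "Data=$r2"
```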
CAVEATS
Record data must be base64-encoded when using the CLI. The maximum record size is 1 MB. A batch put accepts up to 500 records or 4 MB per request. Delivery streams cannot be renamed; create a new stream instead. Buffering adds delivery latency of up to the configured buffer interval. Records that fail delivery to Redshift are written to S3 as a backup.
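Because oversized records are rejected, a simple client-side guard against the 1 MB per-record limit can be sketched like this (the payload here is an illustrative placeholder):

```shell
# Check the raw payload size before attempting put-record.
payload='{"event":"click","user":"u-123"}'
size=$(printf '%s' "$payload" | wc -c | tr -d ' ')
if [ "$size" -le 1048576 ]; then
  echo "ok: $size bytes"
else
  echo "too large: $size bytes" >&2
fi
```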
HISTORY
Amazon Kinesis Firehose launched in October 2015 as the easiest way to load streaming data into AWS. It was renamed to Amazon Data Firehose in February 2024 to reflect its broader scope beyond Kinesis integration. Over time, destinations expanded from S3 and Redshift to include OpenSearch, Splunk, and custom HTTP endpoints. Dynamic partitioning was added in 2021 to enable efficient data lake patterns.
SEE ALSO
aws(1), aws-kinesis(1), aws-s3(1), aws-redshift(1)
