LinuxCommandLibrary

duplicity

Back up and restore directories incrementally

TLDR

Back up a directory via FTPS to a remote machine, encrypting it with a password

$ FTP_PASSWORD=[ftp_login_password] PASSPHRASE=[encryption_password] duplicity [path/to/source_directory] [ftps://user@hostname/path/to/target_directory]/

Back up a directory to Amazon S3, performing a full backup every month
$ duplicity --full-if-older-than [1M] [path/to/source_directory] s3://[bucket_name[/prefix]]

Delete versions older than 1 year from a backup stored on a WebDAV share
$ FTP_PASSWORD=[webdav_login_password] duplicity remove-older-than [1Y] --force [webdav[s]://user@hostname[:port]/some_directory]

List the available backups
$ duplicity collection-status "file://[absolute/path/to/backup_directory]"

List the files in a backup stored on a remote machine, via SSH
$ duplicity list-current-files [[-t|--time]] [YYYY-MM-DD] scp://[user@hostname]/[path/to/backup_directory]

Restore a subdirectory from a GnuPG-encrypted local backup to a given location
$ PASSPHRASE=[gpg_key_password] duplicity restore --encrypt-key [gpg_key_id] --path-to-restore [path/to/restore_directory] file://[absolute/path/to/backup_directory] [path/to/directory_to_restore_to]

SYNOPSIS

duplicity [action] [options] [source_path target_url]

PARAMETERS

--archive-dir <DIR>
    Directory for duplicity cache and metadata (default: ~/.cache/duplicity)

--backend-option <SECTION:KEY=VAL>
    Pass options to storage backend

--compare-data
    Verify data content, not just metadata

--dry-run
    Simulate actions without changes

--encrypt-key <KEYID>
    Encrypt backups to specified GnuPG public key

--exclude <pattern>
    Exclude files matching glob pattern

--exclude-filelist <FILE>
    Read exclude patterns from FILE

--exclude-filelist-stdin
    Read excludes from stdin

--exclude-if-present <FILE>
    Exclude dir if FILE exists inside

--exclude-regex <regex>
    Exclude paths matching regex

--full-if-older-than <TIME>
    Force full backup if older than TIME (e.g., 2W)

--gpg-binary <PATH>
    Path to alternate gpg executable

--hidden-encrypt-key <KEYID>
    Like --encrypt-key, but hide the key ID in the encrypted volumes (passed to gpg as a hidden recipient)

--include <pattern>
    Include files after exclusions

--max-blocksize <N>
    Max rsync block size in bytes (default: 2048)

--no-encryption
    Disable GPG encryption

--num-retries <N>
    Retry failed operations N times (default: 5)

--progress
    Display progress during backup

--rename <original_path> <new_path>
    During restore, treat a path in the backup as if it had been renamed (may be given multiple times)

--restore-time <TIME>
    Restore from specific time (e.g., 2023-01-01)

--sign-key <KEYID>
    Sign backups with GnuPG key

--time <TIME>
    Alias of --restore-time; select the backup time for restore and list operations

--use-agent
    Use gpg-agent for passphrase handling

--verbosity <N>
    Set verbosity level (0-9, default: 4)
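
The include/exclude options form an ordered filter list: the first pattern that matches a path decides its fate, so ending with --exclude '**' turns the selection into an allow-list of the --include'd paths. A minimal sketch of this ordering, using throwaway temp directories and a placeholder passphrase so it is self-contained (a real run would point at your data and storage):

```shell
#!/bin/sh
# Sketch: ordered file selection. "projects" is included, everything
# else is dropped by the trailing --exclude '**'. The temp dirs and
# the passphrase are purely illustrative.
set -e
src=$(mktemp -d) && tgt=$(mktemp -d)
mkdir -p "$src/projects" "$src/junk"
echo data > "$src/projects/keep.txt"
echo data > "$src/junk/skip.txt"

if command -v duplicity >/dev/null 2>&1; then
    # --dry-run: report what would be backed up without writing volumes
    PASSPHRASE=example duplicity --dry-run \
        --include "$src/projects" \
        --exclude '**' \
        "$src" "file://$tgt"
else
    echo "duplicity not installed; selection shown for illustration"
fi
rm -rf "$src" "$tgt"
```

Patterns are tested top to bottom, so putting a broad --exclude before a narrower --include would silently defeat the include.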

DESCRIPTION

Duplicity is a versatile command-line tool for creating secure, bandwidth-efficient backups of directories across local or remote storage. It leverages the rsync algorithm via librsync to produce incremental backups, storing only changes since the last backup, while full backups capture everything. Backups are packaged in compressed tar-format volumes of configurable size, supporting gzip, bzip2, or lzma.

Security is a core feature: backups can be symmetrically or asymmetrically encrypted with GnuPG, signed for integrity verification, and protected with passphrases. Restoration is flexible, allowing recovery to any point in time using --restore-time.

Duplicity supports diverse backends like local files (file://), SSH/SCP/SFTP, FTP, WebDAV, rsync, Amazon S3, Google Drive, and more via plugins. It includes powerful exclusion/inclusion patterns, progress reporting, automatic retries, and space management commands like cleanup and remove-older-than.

Ideal for scripting and automation, it requires Python, librsync, and GnuPG. Verify backups regularly, as chains depend on sequential volumes.
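
For unattended runs, the passphrase can be supplied through the environment. A hypothetical crontab entry (schedule, paths, and passphrase handling are placeholders; in practice, read the passphrase from a root-only file rather than hard-coding it):

```shell
# Hypothetical crontab entry: run daily at 02:00, forcing a full backup
# once a month; otherwise duplicity produces an incremental.
0 2 * * * PASSPHRASE="$(cat /root/.duplicity-pass)" duplicity --full-if-older-than 1M /home/user file:///mnt/backup/home
```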

CAVEATS

Backup chains are sequential: a single missing or corrupt volume breaks every later incremental in the chain. Run verify after backups. GnuPG must be configured for encryption; Python 3 and librsync are required to run. Large files are split across fixed-size volumes.

COMMON ACTIONS

full: Create full backup
inc: Incremental backup
verify [--compare-data]: Check integrity
restore [--restore-time TIME]: Restore files
collection-status: Show backup history
cleanup: Remove failed volumes
remove-older-than TIME: Prune old backups
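
A typical maintenance cycle chains these actions: back up, verify, prune, then review status. A minimal sketch, using throwaway temp directories and placeholder retention values so it is self-contained; in real use the source would be your data and the target remote storage:

```shell
#!/bin/sh
# Maintenance cycle sketch: backup, verify, prune, status.
# SRC/DST are temp dirs and 1M/6M are example retention choices.
set -e
SRC=$(mktemp -d); DST=$(mktemp -d)
echo "hello" > "$SRC/file.txt"
TARGET="file://$DST"

if command -v duplicity >/dev/null 2>&1; then
    export PASSPHRASE=example-passphrase
    duplicity --full-if-older-than 1M "$SRC" "$TARGET"  # full or incremental
    duplicity verify --compare-data "$TARGET" "$SRC"    # check file contents
    duplicity remove-older-than 6M --force "$TARGET"    # prune old chains
    duplicity collection-status "$TARGET"               # show backup history
else
    echo "duplicity not installed; cycle shown for illustration"
fi
rm -rf "$SRC" "$DST"
```

Note that remove-older-than only deletes complete chains whose newest backup is older than the cutoff, so recent incrementals keep their base full backup alive.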

TARGET URL FORMATS

file:///local/path
scp://user@host/path or sftp://
ftp://user@host/path
rsync://host/module/path
s3://bucket/path (needs boto3)
gs://bucket/path (Google Cloud)
More via plugins.
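
Remote backends generally take credentials from the environment; for the s3:// backend, boto3 reads the standard AWS variables. A hypothetical fragment (key values and bucket name are placeholders):

```shell
# Hypothetical credentials for the s3:// backend (read by boto3).
export AWS_ACCESS_KEY_ID=[access_key_id]
export AWS_SECRET_ACCESS_KEY=[secret_access_key]
duplicity [path/to/source_directory] s3://[bucket_name]/[prefix]
```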

HISTORY

Created in 2002 by Ben Escoto, the original author of rdiff-backup, building on librsync's delta efficiency. It is now community-maintained under Kenneth Loafman; the 0.8 series (2019) brought Python 3 support, and later releases added modern backends such as B2 and Swift.

SEE ALSO

gpg(1), rsync(1), tar(1), rdiff-backup(1)
