LinuxCommandLibrary

f5fpc

Open VPN connections to F5 BIG-IP appliances from the command line

TLDR

Open a new VPN connection

$ sudo f5fpc --start

Open a new VPN connection to a specific host
$ sudo f5fpc --start --host [host.example.com]

Specify a username (user will be prompted for a password)
$ sudo f5fpc --start --host [host.example.com] --username [user]

Show the current VPN status
$ sudo f5fpc --info

Shut down the VPN connection
$ sudo f5fpc --stop

SYNOPSIS

f5fpc [options]

PARAMETERS

--start
    Open a new VPN connection.

--stop
    Shut down the current VPN connection.

--info
    Show the status of the current VPN connection.

--host [host]
    Connect to the specified F5 BIG-IP gateway.

--username [user]
    Authenticate as the given user; the password is prompted for interactively.
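
These options are normally combined in a single invocation. The example below is illustrative only; host.example.com and user are placeholder values for your gateway and account.

$ sudo f5fpc --start --host host.example.com --username user
$ sudo f5fpc --info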

DESCRIPTION

f5fpc is the command-line VPN client that F5 Networks ships with its Linux client packages (the BIG-IP Edge Client). It establishes an SSL VPN tunnel from a Linux host to an F5 BIG-IP appliance running Access Policy Manager (APM), and provides options to start a connection, query its status, and shut it down again. Because bringing the tunnel up and down reconfigures network interfaces and routes, the client is normally run as root, which is why the examples above use sudo. The gateway host and username can be supplied on the command line; the password is prompted for interactively.
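
For scripted use, f5fpc can be wrapped in a small shell script. The following is a minimal sketch that assumes only the options shown above and that --start returns once the connection has been initiated; vpn.example.com and alice are placeholder values for the gateway and account.

#!/bin/sh
# Minimal sketch: bring the tunnel up, report its status periodically,
# and shut it down when the script is interrupted.
# vpn.example.com and alice are placeholders for your gateway and account.
set -e

cleanup() {
    sudo f5fpc --stop    # close the VPN session on exit
}
trap cleanup EXIT
trap 'exit 130' INT TERM

sudo f5fpc --start --host vpn.example.com --username alice

while true; do
    sudo f5fpc --info    # print the current connection status
    sleep 60
done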

CAVEATS

f5fpc is proprietary software from F5 Networks and is not packaged by common Linux distributions; it is typically downloaded from the BIG-IP appliance it will connect to, or obtained from F5 directly. Establishing and tearing down the tunnel requires root privileges. The set of supported command-line options varies between client releases, so consult F5's documentation for the version you have installed.

HISTORY

f5fpc is distributed by F5 Networks as part of its Linux VPN client packages; its development history is not widely documented in publicly available sources.

SEE ALSO

openconnect(8), openvpn(8)
