yek
Serialize code repositories for LLM consumption
TLDR
Serialize current directory for LLM consumption
SYNOPSIS
yek [options] [input-paths...]
DESCRIPTION
yek (Farsi for "one") is a fast Rust-based CLI tool that serializes code repositories into text optimized for LLM (Large Language Model) consumption. It combines files into a single output with intelligent ordering and automatic filtering.
The tool respects .gitignore rules, uses Git history to prioritize important files, and automatically skips binary and large files. Output can be split into chunks based on token count or byte size.
When output is piped, yek automatically streams content instead of writing to files. This enables workflows like yek | pbcopy to quickly copy a codebase to clipboard for pasting into an LLM chat.
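The piped-output behavior described above can be used directly from a shell. A sketch (assumes macOS's pbcopy; on Linux, a clipboard tool such as xclip would take its place):

```shell
# yek detects that stdout is a pipe and streams instead of writing files.
# Copy the whole serialized repo to the clipboard for pasting into an LLM chat:
yek | pbcopy

# Same workflow, but capped at roughly 128k tokens:
yek --tokens 128k | pbcopy
```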
Configuration can be stored in yek.toml or yek.yaml files for project-specific settings.
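A project-level configuration file might look like the following. This is an illustrative sketch: the key names are assumed to mirror the CLI flags listed below, so verify them against your installed version's documentation.

```toml
# yek.toml (hypothetical sketch; keys assumed to mirror CLI flags)
tokens = "128k"                # same effect as --tokens 128k
output_dir = "./serialized"    # same effect as --output-dir
line_numbers = true            # same effect as --line-numbers
ignore_patterns = ["*.lock", "dist/**"]  # extends .gitignore, like --ignore-patterns
```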
PARAMETERS
--tokens count
    Limit output by approximate token count (e.g., 128k, 100)
--max-size size
    Limit output by byte size (e.g., 10MB, 128K); default: 10MB
--json
    Output results in JSON format
--output-dir path
    Directory for output files; uses a temp directory if not specified
--output-name name
    Filename written to the current directory
--output-template template
    Custom format using FILEPATH and FILECONTENT placeholders
--ignore-patterns patterns
    Additional patterns to ignore (extends .gitignore)
--unignore-patterns patterns
    Override built-in ignore rules
--line-numbers
    Include line numbers in output
-t, --tree-header
    Include a directory tree at the start of output
--tree-only
    Show only the directory structure, without file contents
--no-config
    Skip loading configuration files
--config-file path
    Use a specific configuration file
--debug
    Enable debug logging
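The --output-template option above can be combined with the other flags; a hedged sketch of how the placeholders might be used (the exact escape handling may differ by shell and version):

```shell
# Wrap each file in a custom header using the FILEPATH and FILECONTENT
# placeholders, numbering lines and limiting the result to ~100k tokens.
yek --line-numbers --tokens 100k \
    --output-template '=== FILEPATH ===
FILECONTENT'
```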
CAVEATS
Token counting is approximate and may vary from actual LLM tokenization. Very large repositories may need chunking with --tokens or --max-size. Glob patterns must be quoted to prevent shell expansion.
HISTORY
yek was created by Mohsen Azimi as a high-performance tool for preparing code for LLM analysis. Written in Rust, it achieved significant speed improvements over similar tools—benchmarks show it running 230× faster than alternatives like Repomix. The name means "one" in Farsi (یک), reflecting its purpose of combining files into one output.
