rgpt
Review code and interact with GPT models from the command line
TLDR
Ask GPT to improve the code with no extra options
Get more detailed, verbose output from rgpt while it reviews the code
Ask GPT to improve the code, limiting the response to a maximum number of GPT-3 tokens
Ask GPT for a more varied result using a temperature value between 0.0 and 2.0 (higher = more random)
Ask GPT to review your code using a specific model
Make rgpt produce JSON output
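Assuming the flags documented in PARAMETERS below, the TLDR items above might look like the following sketch; the token-limit, verbose, and JSON flags are hypothetical, since this page does not define them:

```shell
# Improve code from a file with no extra options (default model and settings)
rgpt -f src/main.py -p 'Review this code and suggest improvements:'

# Ask for a more varied result; temperature is a float, higher = more random
rgpt -f src/main.py -t 1.5

# Review code using a specific model
rgpt -f src/main.py -m gpt-4

# Hypothetical flags not defined in PARAMETERS below: verbose output,
# a token limit, and JSON output might look something like this
rgpt -f src/main.py --verbose --max-tokens 256 --json
```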
SYNOPSIS
rgpt [OPTIONS] [PROMPT | -f FILE]
PARAMETERS
-p, --prompt TEXT
Specifies the input prompt directly on the command line. Enclose in quotes for multiple words.
-f, --file PATH
Reads the input prompt from the specified file PATH. Useful for longer prompts or scripts.
-m, --model NAME
Selects the AI model to use for the response. Examples: 'gpt-4', 'llama2'. Defaults to a configured model.
-s, --stream
Streams the generated response as it becomes available, similar to how content appears in web UIs.
-c, --continue
Continues the last conversation session. rgpt maintains a history or session context to enable multi-turn dialogues.
-t, --temperature VALUE
Sets the creativity/randomness of the model's output, a float between 0.0 and 2.0 (default 0.7). Lower values are more deterministic; higher values are more random.
-o, --output FILE
Writes the generated response to the specified FILE instead of standard output.
-h, --help
Displays a help message and exits.
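Several of the documented options can be combined in one invocation; the file names below are illustrative:

```shell
# Read the prompt from a file, pick a model, lower the temperature for a
# more deterministic answer, and write the response to a file
rgpt -f prompt.txt -m gpt-4 -t 0.2 -o answer.txt

# Continue the previous session with a follow-up prompt,
# streaming the response as it is generated
rgpt -c -s -p 'Now explain your previous answer in more detail.'
```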
DESCRIPTION
The rgpt command provides a powerful and flexible command-line interface for interacting with various large language models (LLMs), including GPT-like models. It allows users to send natural language prompts and receive generated responses directly in their terminal, facilitating quick queries, content generation, and integration into shell scripts.
rgpt is designed for developers, researchers, and anyone looking to leverage AI capabilities without leaving their command-line environment. It supports features like specifying different models, managing conversation context, and streaming responses for real-time interaction. Its primary goal is to make AI a first-class citizen of the command line, in keeping with the Unix philosophy.
CAVEATS
The rgpt command typically requires an active internet connection to communicate with remote AI services.
Users must configure necessary API keys or authentication tokens, often via environment variables or a configuration file, which may incur costs based on API usage.
As with all LLMs, responses might occasionally be inaccurate, nonsensical, or reflect biases present in training data.
CONFIGURATION
rgpt typically relies on a configuration file (e.g., ~/.config/rgpt/config.json) or environment variables (RGPT_API_KEY, RGPT_DEFAULT_MODEL) for setting up API keys, default models, and other preferences. This allows for persistent settings without needing to specify them with every command.
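A minimal setup might look like the following sketch. The two environment variables are the ones named above; the key names inside the config file are an assumption, since this page only gives the file's location:

```shell
# Environment variables named on this page
export RGPT_API_KEY='sk-...'          # placeholder; keep real keys out of scripts
export RGPT_DEFAULT_MODEL='gpt-4'

# Or a persistent config file (key names are illustrative)
mkdir -p ~/.config/rgpt
cat > ~/.config/rgpt/config.json <<'EOF'
{
  "api_key": "sk-...",
  "default_model": "gpt-4",
  "temperature": 0.7
}
EOF
```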
SCRIPTING USAGE
Due to its command-line nature, rgpt is ideal for scripting. It can be easily piped with other standard Unix commands. For example, you can feed output from grep into rgpt for analysis, or use rgpt's output as input for further text processing with sed or awk.
Example: cat my_log.txt | rgpt -p 'Summarize this log file:'
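Building on that example, a pipeline might filter a log before sending it for analysis, or post-process the reply with standard tools; flag usage follows PARAMETERS above:

```shell
# Extract only the error lines, ask for a summary, and save it to a file
grep -i 'error' my_log.txt | rgpt -p 'Summarize these log errors:' -o errors_summary.txt

# Feed rgpt's output into further text processing with awk
rgpt -p 'List three POSIX shells, one per line' | awk '{print NR": "$0}'
```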
HISTORY
While not a traditional Unix utility, the concept of rgpt emerged from the growing need for direct command-line interaction with large language models. Its development began in late 2022/early 2023, following the widespread adoption of GPT-style AIs. It's often a custom script or a community-driven open-source project aimed at integrating AI workflows into existing developer toolchains, offering a lightweight alternative to web interfaces for rapid prototyping and scripting.