Local LLM Prompt Manager
A CLI tool for developers to manage, organize, version control, and share local LLM prompts and configurations.
Features
- Create & Manage Prompts: Store prompts in YAML format with metadata
- Tagging System: Organize prompts with tags for fast filtering
- Powerful Search: Find prompts by name, content, or tags
- Template System: Jinja2-style templating with variable substitution
- Git Integration: Generate commit messages from staged changes
- LLM Integration: Works with Ollama and LM Studio
- Import/Export: Share prompts in JSON/YAML or LLM-specific formats
Installation
Quick Start
Create a Prompt
List Prompts
Run a Prompt with Variables
Generate a Commit Message
Commands
Prompt Management
| Command |
Description |
llm-prompt prompt create <name> |
Create a new prompt |
llm-prompt prompt list |
List all prompts |
llm-prompt prompt show <name> |
Show prompt details |
llm-prompt prompt delete <name> |
Delete a prompt |
Tag Management
| Command |
Description |
llm-prompt tag list |
List all tags |
llm-prompt tag add <prompt> <tag> |
Add tag to prompt |
llm-prompt tag remove <prompt> <tag> |
Remove tag from prompt |
Search & Run
| Command |
Description |
llm-prompt search [query] |
Search prompts |
llm-prompt run <name> |
Run a prompt |
Git Integration
| Command |
Description |
llm-prompt commit |
Generate commit message |
Import/Export
| Command |
Description |
llm-prompt export <output> |
Export prompts |
llm-prompt import <input> |
Import prompts |
Configuration
| Command |
Description |
llm-prompt config show |
Show current config |
llm-prompt config set <key> <value> |
Set config value |
llm-prompt config test |
Test LLM connection |
Prompt Format
Prompts are stored as YAML files:
provider: ollama
model: llama3.2
Provide variables with --var key=value:
Configuration
Default configuration file: ~/.config/llm-prompt-manager/config.yaml
Environment Variables
| Variable |
Default |
Description |
LLM_PROMPT_DIR |
~/.config/llm-prompt-manager/prompts |
Prompt storage directory |
OLLAMA_URL |
http://localhost:11434 |
Ollama API URL |
LMSTUDIO_URL |
http://localhost:1234 |
LM Studio API URL |
DEFAULT_MODEL |
llama3.2 |
Default LLM model |
Examples
Create a Code Explainer Prompt
Export to Ollama Format
Import from Another Library
Troubleshooting
| Error |
Solution |
| Prompt not found |
Use llm-prompt list to see available prompts |
| LLM connection failed |
Ensure Ollama/LM Studio is running |
| Invalid YAML format |
Check prompt YAML has required fields |
| Template variable missing |
Provide all required --var values |
| Git repository not found |
Run from a directory with .git folder |
License
MIT