# Local LLM Prompt Manager
A CLI tool for developers to manage, organize, version control, and share local LLM prompts and configurations.
## Features
- **Create & Manage Prompts**: Store prompts in YAML format with metadata
- **Tagging System**: Organize prompts with tags for fast filtering
- **Powerful Search**: Find prompts by name, content, or tags
- **Template System**: Jinja2-style templating with variable substitution
- **Git Integration**: Generate commit messages from staged changes
- **LLM Integration**: Works with Ollama and LM Studio
- **Import/Export**: Share prompts in JSON/YAML or LLM-specific formats
## Installation
```bash
# From source
pip install -e .
# Or install dependencies manually
pip install click rich pyyaml requests jinja2
```
## Quick Start
### Create a Prompt
```bash
llm-prompt prompt create my-prompt \
  --template "Explain {{ concept }} in simple terms" \
  --description "Simple explanation generator" \
  --tag "explanation" \
  --variable "concept:What to explain:true"
```
### List Prompts
```bash
llm-prompt prompt list
llm-prompt prompt list --tag "explanation"
```
### Run a Prompt with Variables
```bash
llm-prompt run my-prompt --var concept="quantum physics"
```
### Generate a Commit Message
```bash
# Stage some changes first
git add .
# Generate commit message
llm-prompt commit
```
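Internally, commit-message generation amounts to feeding the staged diff into a prompt template. A minimal sketch of how that might work (the `COMMIT_TEMPLATE` wording and function names here are hypothetical, not the tool's actual internals):

```python
import subprocess

# Hypothetical template; the tool's real commit prompt may differ.
COMMIT_TEMPLATE = (
    "Write a concise conventional-commit message for this diff:\n\n{diff}"
)

def staged_diff() -> str:
    """Return the staged changes, as printed by `git diff --cached`."""
    return subprocess.run(
        ["git", "diff", "--cached"],
        capture_output=True, text=True, check=True,
    ).stdout

def build_commit_prompt(diff: str) -> str:
    """Fill the commit-message template with the staged diff."""
    return COMMIT_TEMPLATE.format(diff=diff)
```

The resulting prompt is then sent to the configured LLM provider, and the model's reply becomes the suggested commit message.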
## Commands
### Prompt Management
| Command | Description |
|---------|-------------|
| `llm-prompt prompt create <name>` | Create a new prompt |
| `llm-prompt prompt list` | List all prompts |
| `llm-prompt prompt show <name>` | Show prompt details |
| `llm-prompt prompt delete <name>` | Delete a prompt |
### Tag Management
| Command | Description |
|---------|-------------|
| `llm-prompt tag list` | List all tags |
| `llm-prompt tag add <prompt> <tag>` | Add tag to prompt |
| `llm-prompt tag remove <prompt> <tag>` | Remove tag from prompt |
### Search & Run
| Command | Description |
|---------|-------------|
| `llm-prompt search [query]` | Search prompts |
| `llm-prompt run <name>` | Run a prompt |
### Git Integration
| Command | Description |
|---------|-------------|
| `llm-prompt commit` | Generate commit message |
### Import/Export
| Command | Description |
|---------|-------------|
| `llm-prompt export <output>` | Export prompts |
| `llm-prompt import <input>` | Import prompts |
### Configuration
| Command | Description |
|---------|-------------|
| `llm-prompt config show` | Show current config |
| `llm-prompt config set <key> <value>` | Set config value |
| `llm-prompt config test` | Test LLM connection |
## Prompt Format
Prompts are stored as YAML files:
```yaml
name: explain-code
description: Explain code in simple terms
tags: [documentation, beginner]
variables:
  - name: code
    description: The code to explain
    required: true
  - name: level
    description: Explanation level (beginner/intermediate/advanced)
    required: false
    default: beginner
template: |
    Explain the following code in {{ level }} terms:
    ```python
    {{ code }}
    ```
provider: ollama
model: llama3.2
```
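A loader only needs a couple of checks before a file like this is usable. A minimal validation sketch over the parsed YAML (the required-field set is an assumption read off the fields above; the tool's real checks may be stricter):

```python
# Fields assumed mandatory, based on the example above.
REQUIRED_FIELDS = {"name", "template"}

def validate_prompt(prompt: dict) -> list[str]:
    """Return a list of problems; an empty list means the prompt looks valid."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - prompt.keys())]
    # Each entry under `variables` needs at least a name.
    for var in prompt.get("variables", []):
        if "name" not in var:
            errors.append("variable entry missing 'name'")
    return errors
```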
## Template System
Use Jinja2 syntax for variable substitution:
```jinja2
Hello {{ name }}!
Your task: {{ task }}
{% for item in items %}
- {{ item }}
{% endfor %}
```
Provide variables with `--var key=value`:
```bash
llm-prompt run my-prompt --var name="Alice" --var task="review code"
```
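The variable-substitution part of rendering can be sketched in a few lines. This is a deliberately simplified stand-in that only handles `{{ name }}` placeholders; the tool itself uses full Jinja2, which also covers the loop and conditional syntax shown above:

```python
import re

def render(template: str, variables: dict[str, str]) -> str:
    """Replace {{ name }} placeholders with values from `variables`.

    Raises KeyError when a placeholder has no matching value, mirroring
    the "template variable missing" failure mode.
    """
    def repl(match: re.Match) -> str:
        return str(variables[match.group(1)])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", repl, template)
```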
## Configuration
Default configuration file: `~/.config/llm-prompt-manager/config.yaml`
```yaml
prompt_dir: ~/.config/llm-prompt-manager/prompts
ollama_url: http://localhost:11434
lmstudio_url: http://localhost:1234
default_model: llama3.2
default_provider: ollama
```
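For the Ollama provider, a run boils down to one HTTP request against `ollama_url`. A sketch of the request a backend might build (the endpoint and payload fields follow Ollama's `/api/generate` HTTP API; how this tool actually assembles the call is an assumption):

```python
def build_generate_request(base_url: str, model: str, prompt: str) -> tuple[str, dict]:
    """Build the URL and JSON payload for Ollama's /api/generate endpoint."""
    url = f"{base_url.rstrip('/')}/api/generate"
    # stream=False asks Ollama for a single JSON response instead of chunks.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return url, payload
```

The pair can then be sent with `requests.post(url, json=payload)`; Ollama returns the completion in the `response` field of its JSON reply.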
## Environment Variables
| Variable | Default | Description |
|----------|---------|-------------|
| `LLM_PROMPT_DIR` | `~/.config/llm-prompt-manager/prompts` | Prompt storage directory |
| `OLLAMA_URL` | `http://localhost:11434` | Ollama API URL |
| `LMSTUDIO_URL` | `http://localhost:1234` | LM Studio API URL |
| `DEFAULT_MODEL` | `llama3.2` | Default LLM model |
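A plausible resolution order for these settings is environment variable, then config-file value, then built-in default (this precedence is an assumption inferred from the two tables above, not documented behavior):

```python
import os

def resolve(env_var: str, config: dict, key: str, default: str) -> str:
    """Pick a setting: env var wins over config file, which wins over the default."""
    return os.environ.get(env_var) or config.get(key) or default
```

For example, `resolve("OLLAMA_URL", config, "ollama_url", "http://localhost:11434")` would honor an exported `OLLAMA_URL` before falling back to `config.yaml`.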
## Examples
### Create a Code Explainer Prompt
```bash
llm-prompt prompt create explain-code \
  --template "Explain this code: {{ code }}" \
  --description "Explain code in simple terms" \
  --tag "docs" \
  --variable "code:The code to explain:true"
```
### Export to Ollama Format
```bash
llm-prompt export my-prompts.yaml --format ollama
```
### Import from Another Library
```bash
llm-prompt import /path/to/prompts.json --force
```
## Troubleshooting
| Error | Solution |
|-------|----------|
| Prompt not found | Use `llm-prompt prompt list` to see available prompts |
| LLM connection failed | Ensure Ollama/LM Studio is running |
| Invalid YAML format | Check prompt YAML has required fields |
| Template variable missing | Provide all required `--var` values |
| Git repository not found | Run from a directory with `.git` folder |
## License
MIT