# Local LLM Prompt Manager

[CI](https://7000pct.gitea.bloupla.net/7000pctAUTO/local-llm-prompt-manager/actions) · [PyPI](https://pypi.org/project/local-llm-prompt-manager/) · [License: MIT](https://opensource.org/licenses/MIT)

A CLI tool for developers to manage, organize, version control, and share local LLM prompts and configurations. Features include creating prompt libraries, tagging, search, templating, and generating commit messages. Works with Ollama, LM Studio, and other local LLM runners.
## Features

- **Create & Manage Prompts**: Store prompts in YAML format with metadata
- **Tagging System**: Organize prompts with tags for fast filtering
- **Powerful Search**: Find prompts by name, content, or tags
- **Template System**: Jinja2-style templating with variable substitution
- **Git Integration**: Generate commit messages from staged changes
- **LLM Integration**: Works with Ollama and LM Studio
- **Import/Export**: Share prompts in JSON/YAML or LLM-specific formats

## Installation

```bash
# Install from PyPI (coming soon)
pip install local-llm-prompt-manager

# Or install from source
git clone https://7000pct.gitea.bloupla.net/7000pctAUTO/local-llm-prompt-manager.git
cd local-llm-prompt-manager
pip install -e .

# Or install dependencies manually
pip install click rich pyyaml requests jinja2
```

## Quick Start
### Create a Prompt

```bash
llm-prompt prompt create my-prompt \
  --template "Explain {{ concept }} in simple terms" \
  --description "Simple explanation generator" \
  --tag "explanation" \
  --variable "concept:What to explain:true"
```

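The `--variable` flag packs three fields into one colon-separated string (`name:description:required`). How the tool parses it internally is not documented; a minimal sketch of decoding such a spec (`parse_variable_spec` is a hypothetical helper, not part of the CLI):

```python
def parse_variable_spec(spec: str) -> dict:
    """Decode a 'name:description:required' variable spec.

    Hypothetical helper illustrating the flag format. Only the first
    and last colons are treated as separators, so descriptions may
    themselves contain colons; a missing third field defaults to
    not-required.
    """
    name, _, rest = spec.partition(":")
    description, _, required = rest.rpartition(":")
    if not description:  # only two fields were given
        description, required = rest, "false"
    return {
        "name": name,
        "description": description,
        "required": required.lower() == "true",
    }
```

For example, `parse_variable_spec("concept:What to explain:true")` yields a required variable named `concept` with the description `What to explain`.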
### List Prompts

```bash
llm-prompt prompt list
llm-prompt prompt list --tag "explanation"
```

### Run a Prompt with Variables

```bash
llm-prompt run my-prompt --var concept="quantum physics"
```

### Generate a Commit Message

```bash
# Stage some changes first
git add .

# Generate a commit message from the staged diff
llm-prompt commit
```

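Presumably `llm-prompt commit` reads the staged diff (e.g. via `git diff --cached`) and wraps it in a prompt for the configured model. A hedged sketch of that flow; both function names and the prompt wording are illustrative, not the tool's actual API:

```python
import subprocess

def staged_diff() -> str:
    """Return the staged diff, as 'git diff --cached' prints it."""
    return subprocess.run(
        ["git", "diff", "--cached"],
        capture_output=True, text=True, check=True,
    ).stdout

def build_commit_prompt(diff: str) -> str:
    """Wrap a diff in an instruction for the LLM (illustrative wording)."""
    return (
        "Write a concise, imperative-mood commit message "
        "for the following staged changes:\n\n" + diff
    )
```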
## Commands

### Prompt Management

| Command | Description |
|---------|-------------|
| `llm-prompt prompt create <name>` | Create a new prompt |
| `llm-prompt prompt list` | List all prompts |
| `llm-prompt prompt show <name>` | Show prompt details |
| `llm-prompt prompt delete <name>` | Delete a prompt |

### Tag Management

| Command | Description |
|---------|-------------|
| `llm-prompt tag list` | List all tags |
| `llm-prompt tag add <prompt> <tag>` | Add a tag to a prompt |
| `llm-prompt tag remove <prompt> <tag>` | Remove a tag from a prompt |

### Search & Run

| Command | Description |
|---------|-------------|
| `llm-prompt search [query]` | Search prompts |
| `llm-prompt run <name>` | Run a prompt |

### Git Integration

| Command | Description |
|---------|-------------|
| `llm-prompt commit` | Generate a commit message |

### Import/Export

| Command | Description |
|---------|-------------|
| `llm-prompt export <output>` | Export prompts |
| `llm-prompt import <input>` | Import prompts |

### Configuration

| Command | Description |
|---------|-------------|
| `llm-prompt config show` | Show current config |
| `llm-prompt config set <key> <value>` | Set a config value |
| `llm-prompt config test` | Test the LLM connection |

## Prompt Format

Prompts are stored as YAML files:

````yaml
name: explain-code
description: Explain code in simple terms
tags: [documentation, beginner]
variables:
  - name: code
    description: The code to explain
    required: true
  - name: level
    description: Explanation level (beginner/intermediate/advanced)
    required: false
    default: beginner
template: |
  Explain the following code in {{ level }} terms:

  ```python
  {{ code }}
  ```
provider: ollama
model: llama3.2
````

## Template System

Use Jinja2 syntax for variable substitution:

```jinja2
Hello {{ name }}!
Your task: {{ task }}

{% for item in items %}
- {{ item }}
{% endfor %}
```

Provide variables with `--var key=value`:

```bash
llm-prompt run my-prompt --var name="Alice" --var task="review code"
```

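Since `jinja2` is in the dependency list, rendering presumably goes through that library. As a self-contained illustration of the substitution step only, here is a standard-library sketch that handles plain `{{ name }}` placeholders; real Jinja2 additionally supports `{% for %}` blocks, filters, and defaults:

```python
import re

def render(template: str, variables: dict) -> str:
    """Substitute {{ name }} placeholders with values from `variables`.

    Simplified stand-in for Jinja2 rendering: a missing variable raises
    KeyError, mirroring a required --var that was not provided.
    """
    def sub(match: re.Match) -> str:
        name = match.group(1)
        if name not in variables:
            raise KeyError(f"missing template variable: {name}")
        return str(variables[name])

    return re.sub(r"\{\{\s*(\w+)\s*\}\}", sub, template)

# render("Explain {{ concept }} in simple terms", {"concept": "recursion"})
# returns "Explain recursion in simple terms"
```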
## Configuration

Default configuration file: `~/.config/llm-prompt-manager/config.yaml`

```yaml
prompt_dir: ~/.config/llm-prompt-manager/prompts
ollama_url: http://localhost:11434
lmstudio_url: http://localhost:1234
default_model: llama3.2
default_provider: ollama
```

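One plausible way `llm-prompt config test` could probe the connection is Ollama's `/api/tags` endpoint, which lists installed models. A sketch of such a check (the function names are illustrative, and the standard library is used here to keep the example self-contained, though the tool itself depends on `requests`):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # matches the config default above

def tags_endpoint(base_url: str) -> str:
    """Build the Ollama model-list URL used as a connection probe."""
    return base_url.rstrip("/") + "/api/tags"

def check_ollama(base_url: str = OLLAMA_URL) -> list:
    """Return installed model names, or raise if Ollama is unreachable."""
    with urllib.request.urlopen(tags_endpoint(base_url), timeout=5) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]
```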
## Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `LLM_PROMPT_DIR` | `~/.config/llm-prompt-manager/prompts` | Prompt storage directory |
| `OLLAMA_URL` | `http://localhost:11434` | Ollama API URL |
| `LMSTUDIO_URL` | `http://localhost:1234` | LM Studio API URL |
| `DEFAULT_MODEL` | `llama3.2` | Default LLM model |

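The README does not spell out precedence between these variables and `config.yaml`. A common convention (and only an assumption here) is environment variable first, then the config file value, then the built-in default:

```python
import os

def resolve(env_key: str, config: dict, cfg_key: str, default: str) -> str:
    """Environment variable > config-file value > built-in default.

    Assumed precedence for illustration; the tool's actual lookup
    order is not documented in this README.
    """
    return os.environ.get(env_key) or config.get(cfg_key) or default

# With config loaded from config.yaml, e.g. {"ollama_url": "http://localhost:11434"}:
# resolve("OLLAMA_URL", config, "ollama_url", "http://localhost:11434")
```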
## Examples
### Create a Code Explainer Prompt

```bash
llm-prompt prompt create explain-code \
  --template "Explain this code: {{ code }}" \
  --description "Explain code in simple terms" \
  --tag "docs" \
  --variable "code:The code to explain:true"
```

### Export to Ollama Format

```bash
llm-prompt export my-prompts.yaml --format ollama
```

### Import from Another Library

```bash
llm-prompt import /path/to/prompts.json --force
```

## Testing

```bash
# Run all tests
pytest tests/ -v

# Run with coverage
pytest tests/ --cov=src

# Run a specific test file
pytest tests/test_cli.py -v
```

## Troubleshooting

| Error | Solution |
|-------|----------|
| Prompt not found | Use `llm-prompt prompt list` to see available prompts |
| LLM connection failed | Ensure Ollama or LM Studio is running |
| Invalid YAML format | Check that the prompt YAML has all required fields |
| Template variable missing | Provide all required `--var` values |
| Git repository not found | Run from a directory containing a `.git` folder |

## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

## License

MIT License - see [LICENSE](LICENSE) file for details.