
Local LLM Prompt Manager

A CLI tool for developers to manage, organize, version control, and share local LLM prompts and configurations. Features include creating prompt libraries, tagging, search, templating, and generating commit messages. Works with Ollama, LM Studio, and other local LLM runners.

Features

  • Create & Manage Prompts: Store prompts in YAML format with metadata
  • Tagging System: Organize prompts with tags for fast filtering
  • Powerful Search: Find prompts by name, content, or tags
  • Template System: Jinja2-style templating with variable substitution
  • Git Integration: Generate commit messages from staged changes
  • LLM Integration: Works with Ollama and LM Studio
  • Import/Export: Share prompts in JSON/YAML or LLM-specific formats

Installation

# Install from PyPI (coming soon)
pip install local-llm-prompt-manager

# From source
git clone https://7000pct.gitea.bloupla.net/7000pctAUTO/local-llm-prompt-manager.git
cd local-llm-prompt-manager
pip install -e .

# Or install dependencies manually
pip install click rich pyyaml requests jinja2

Quick Start

Create a Prompt

llm-prompt prompt create my-prompt \
  --template "Explain {{ concept }} in simple terms" \
  --description "Simple explanation generator" \
  --tag "explanation" \
  --variable "concept:What to explain:true"
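
The `--variable` flag above takes a `name:description:required` spec. A minimal sketch of how such a spec could be parsed (the function name is illustrative, not the tool's actual internals; descriptions may themselves contain colons, so the first and last fields are taken from the ends):

```python
def parse_variable_spec(spec: str) -> dict:
    """Parse a "name:description:required" variable spec (illustrative sketch)."""
    parts = spec.split(":")
    return {
        "name": parts[0],
        # Re-join the middle so descriptions containing ":" survive intact.
        "description": ":".join(parts[1:-1]),
        "required": parts[-1].lower() == "true",
    }
```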

List Prompts

llm-prompt prompt list
llm-prompt prompt list --tag "explanation"

Run a Prompt with Variables

llm-prompt run my-prompt --var concept="quantum physics"

Generate a Commit Message

# Stage some changes first
git add .

# Generate commit message
llm-prompt commit
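
Conceptually, commit-message generation reads the staged diff and wraps it in an instruction for the local LLM. A stdlib-only sketch under that assumption (function names and the prompt wording are illustrative, not the tool's actual implementation):

```python
import subprocess

def staged_diff() -> str:
    """Return the staged diff, equivalent to running `git diff --cached`."""
    return subprocess.run(
        ["git", "diff", "--cached"], capture_output=True, text=True, check=True
    ).stdout

def build_commit_prompt(diff: str) -> str:
    """Wrap a staged diff in an instruction for the LLM to summarize."""
    return (
        "Write a concise, imperative-mood commit message "
        "for the following staged changes:\n\n" + diff
    )
```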

Commands

Prompt Management

Command                          Description
llm-prompt prompt create <name>  Create a new prompt
llm-prompt prompt list           List all prompts
llm-prompt prompt show <name>    Show prompt details
llm-prompt prompt delete <name>  Delete a prompt

Tag Management

Command                               Description
llm-prompt tag list                   List all tags
llm-prompt tag add <prompt> <tag>     Add a tag to a prompt
llm-prompt tag remove <prompt> <tag>  Remove a tag from a prompt

Search & Run

Command                    Description
llm-prompt search [query]  Search prompts
llm-prompt run <name>      Run a prompt

Git Integration

Command            Description
llm-prompt commit  Generate a commit message from staged changes

Import/Export

Command                     Description
llm-prompt export <output>  Export prompts
llm-prompt import <input>   Import prompts

Configuration

Command                              Description
llm-prompt config show               Show the current config
llm-prompt config set <key> <value>  Set a config value
llm-prompt config test               Test the LLM connection

Prompt Format

Prompts are stored as YAML files:

name: explain-code
description: Explain code in simple terms
tags: [documentation, beginner]
variables:
  - name: code
    description: The code to explain
    required: true
  - name: level
    description: Explanation level (beginner/intermediate/advanced)
    required: false
    default: beginner
template: |
  Explain the following code in {{ level }} terms:

  ```python
  {{ code }}
  ```
provider: ollama
model: llama3.2
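
Once such a file is parsed (e.g. with PyYAML), required variables can be checked before rendering. A stdlib-only sketch of that check, using a dict shaped like the YAML above (the function name is illustrative):

```python
def check_required_vars(prompt: dict, provided: dict) -> list:
    """Return the names of required variables missing from `provided`."""
    return [
        v["name"]
        for v in prompt.get("variables", [])
        if v.get("required") and v["name"] not in provided
    ]

# Dict mirroring the explain-code YAML example above.
prompt = {
    "name": "explain-code",
    "variables": [
        {"name": "code", "required": True},
        {"name": "level", "required": False, "default": "beginner"},
    ],
}
```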


Template System

Use Jinja2 syntax for variable substitution:

Hello {{ name }}!
Your task: {{ task }}

{% for item in items %}
- {{ item }}
{% endfor %}

Provide variables with --var key=value:

llm-prompt run my-prompt --var name="Alice" --var task="review code"
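
The tool uses Jinja2, but the core idea of simple-variable substitution can be shown with a stdlib-only sketch (this handles only `{{ name }}` placeholders, not `{% for %}` loops, and is not the tool's actual renderer):

```python
import re

def render(template: str, variables: dict) -> str:
    """Substitute {{ name }} placeholders (simple-variable subset of Jinja2)."""
    def repl(match):
        return str(variables[match.group(1)])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", repl, template)
```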

Configuration

Default configuration file: ~/.config/llm-prompt-manager/config.yaml

prompt_dir: ~/.config/llm-prompt-manager/prompts
ollama_url: http://localhost:11434
lmstudio_url: http://localhost:1234
default_model: llama3.2
default_provider: ollama

Environment Variables

Variable        Default                               Description
LLM_PROMPT_DIR  ~/.config/llm-prompt-manager/prompts  Prompt storage directory
OLLAMA_URL      http://localhost:11434                Ollama API URL
LMSTUDIO_URL    http://localhost:1234                 LM Studio API URL
DEFAULT_MODEL   llama3.2                              Default LLM model
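
One plausible resolution order, with environment variables overriding the built-in defaults, can be sketched with the stdlib (the function name and lookup order are illustrative assumptions, not the tool's confirmed behavior):

```python
import os

# Built-in defaults matching the config.yaml example above.
DEFAULTS = {
    "prompt_dir": "~/.config/llm-prompt-manager/prompts",
    "ollama_url": "http://localhost:11434",
    "lmstudio_url": "http://localhost:1234",
    "default_model": "llama3.2",
}

def setting(key: str, env_var: str) -> str:
    """Resolve a setting: environment variable first, then the built-in default."""
    return os.environ.get(env_var, DEFAULTS[key])
```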

Examples

Create a Code Explainer Prompt

llm-prompt prompt create explain-code \
  --template "Explain this code: {{ code }}" \
  --description "Explain code in simple terms" \
  --tag "docs" \
  --variable "code:The code to explain:true"

Export to Ollama Format

llm-prompt export my-prompts.yaml --format ollama

Import from Another Library

llm-prompt import /path/to/prompts.json --force

Testing

# Run all tests
pytest tests/ -v

# Run with coverage
pytest tests/ --cov=src

# Run specific test file
pytest tests/test_cli.py -v

Troubleshooting

Error                      Solution
Prompt not found           Use llm-prompt prompt list to see available prompts
LLM connection failed      Ensure Ollama or LM Studio is running
Invalid YAML format        Check that the prompt YAML has all required fields
Template variable missing  Provide all required --var values
Git repository not found   Run from a directory containing a .git folder

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

MIT License - see LICENSE file for details.
