# Local LLM Prompt Manager

A CLI tool for developers to manage, organize, version control, and share local LLM prompts and configurations.
## Features

- **Create & Manage Prompts**: Store prompts in YAML format with metadata
- **Tagging System**: Organize prompts with tags for fast filtering
- **Powerful Search**: Find prompts by name, content, or tags
- **Template System**: Jinja2-style templating with variable substitution
- **Git Integration**: Generate commit messages from staged changes
- **LLM Integration**: Works with Ollama and LM Studio
- **Import/Export**: Share prompts in JSON/YAML or LLM-specific formats
## Installation

```bash
# From source
pip install -e .

# Or install dependencies manually
pip install click rich pyyaml requests jinja2
```
## Quick Start

### Create a Prompt

```bash
llm-prompt prompt create my-prompt \
  --template "Explain {{ concept }} in simple terms" \
  --description "Simple explanation generator" \
  --tag "explanation" \
  --variable "concept:What to explain:true"
```
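The `--variable` flag packs a name, a description, and a required flag into one colon-separated string. A minimal Python sketch of how such a spec could be parsed (the helper name is hypothetical, not part of the CLI):

```python
def parse_variable_spec(spec: str) -> dict:
    """Parse a "name:description:required" spec into a dict.

    Splits the name off the front and the required flag off the back,
    so the description itself may contain colons.
    """
    name, rest = spec.split(":", 1)
    description, required = rest.rsplit(":", 1)
    return {
        "name": name,
        "description": description,
        "required": required.lower() == "true",
    }

print(parse_variable_spec("concept:What to explain:true"))
# → {'name': 'concept', 'description': 'What to explain', 'required': True}
```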
### List Prompts

```bash
llm-prompt prompt list
llm-prompt prompt list --tag "explanation"
```
### Run a Prompt with Variables

```bash
llm-prompt run my-prompt --var concept="quantum physics"
```
### Generate a Commit Message

```bash
# Stage some changes first
git add .

# Generate commit message
llm-prompt commit
```
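Conceptually, a commit command like this feeds the staged diff into a prompt. A sketch of how such a prompt might be assembled (the function and its wording are illustrative; the real `llm-prompt commit` may differ):

```python
def build_commit_prompt(staged_diff: str, max_chars: int = 4000) -> str:
    """Assemble an LLM prompt from `git diff --cached` output.

    Truncates the diff to keep the prompt within a rough context
    budget; both the limit and the phrasing are assumptions.
    """
    diff = staged_diff[:max_chars]
    return (
        "Write a concise, imperative-mood git commit message "
        "for the following staged changes:\n\n" + diff
    )

# In a real CLI the diff would come from something like:
#   subprocess.run(["git", "diff", "--cached"],
#                  capture_output=True, text=True).stdout
```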
## Commands

### Prompt Management

| Command | Description |
|---------|-------------|
| `llm-prompt prompt create <name>` | Create a new prompt |
| `llm-prompt prompt list` | List all prompts |
| `llm-prompt prompt show <name>` | Show prompt details |
| `llm-prompt prompt delete <name>` | Delete a prompt |

### Tag Management

| Command | Description |
|---------|-------------|
| `llm-prompt tag list` | List all tags |
| `llm-prompt tag add <prompt> <tag>` | Add tag to prompt |
| `llm-prompt tag remove <prompt> <tag>` | Remove tag from prompt |

### Search & Run

| Command | Description |
|---------|-------------|
| `llm-prompt search [query]` | Search prompts |
| `llm-prompt run <name>` | Run a prompt |

### Git Integration

| Command | Description |
|---------|-------------|
| `llm-prompt commit` | Generate commit message |

### Import/Export

| Command | Description |
|---------|-------------|
| `llm-prompt export <output>` | Export prompts |
| `llm-prompt import <input>` | Import prompts |
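Search matches on name, content, or tags. A small sketch of that matching logic, using the field names from the prompt YAML format (`name`, `template`, `tags`) but otherwise illustrative:

```python
def matches(prompt: dict, query: str) -> bool:
    """Case-insensitive match against a prompt's name, template, or tags."""
    q = query.lower()
    return (
        q in prompt.get("name", "").lower()
        or q in prompt.get("template", "").lower()
        or any(q in tag.lower() for tag in prompt.get("tags", []))
    )

prompts = [
    {"name": "explain-code", "template": "Explain this code", "tags": ["docs"]},
    {"name": "summarize", "template": "Summarize {{ text }}", "tags": ["writing"]},
]
print([p["name"] for p in prompts if matches(p, "docs")])
# → ['explain-code']
```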
### Configuration

| Command | Description |
|---------|-------------|
| `llm-prompt config show` | Show current config |
| `llm-prompt config set <key> <value>` | Set config value |
| `llm-prompt config test` | Test LLM connection |
## Prompt Format

Prompts are stored as YAML files:

````yaml
name: explain-code
description: Explain code in simple terms
tags: [documentation, beginner]
variables:
  - name: code
    description: The code to explain
    required: true
  - name: level
    description: Explanation level (beginner/intermediate/advanced)
    required: false
    default: beginner
template: |
  Explain the following code in {{ level }} terms:

  ```python
  {{ code }}
  ```
provider: ollama
model: llama3.2
````
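Before rendering, each variable is resolved from the values the user supplies, falling back to its `default`, and required variables with no value are an error. A sketch of that resolution step (the function itself is illustrative, not CLI code):

```python
def resolve_variables(var_specs: list, provided: dict) -> dict:
    """Apply defaults and enforce required variables before rendering."""
    values = {}
    for spec in var_specs:
        name = spec["name"]
        if name in provided:
            values[name] = provided[name]
        elif "default" in spec:
            values[name] = spec["default"]
        elif spec.get("required", False):
            raise ValueError(f"missing required variable: {name}")
    return values

specs = [
    {"name": "code", "required": True},
    {"name": "level", "required": False, "default": "beginner"},
]
print(resolve_variables(specs, {"code": "print(1)"}))
# → {'code': 'print(1)', 'level': 'beginner'}
```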
## Template System

Use Jinja2 syntax for variable substitution:

```jinja2
Hello {{ name }}!
Your task: {{ task }}

{% for item in items %}
- {{ item }}
{% endfor %}
```

Provide variables with `--var key=value`:

```bash
llm-prompt run my-prompt --var name="Alice" --var task="review code"
```
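The substitution itself can be approximated in a few lines. The CLI uses real Jinja2 (so `{% for %}` blocks and filters work); this stdlib regex version handles only simple `{{ name }}` placeholders and is shown purely to illustrate the idea:

```python
import re

def render_simple(template: str, variables: dict) -> str:
    """Replace {{ name }} placeholders; unknown names are left as-is."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )

print(render_simple("Hello {{ name }}! Your task: {{ task }}",
                    {"name": "Alice", "task": "review code"}))
# → Hello Alice! Your task: review code
```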
## Configuration

Default configuration file: `~/.config/llm-prompt-manager/config.yaml`

```yaml
prompt_dir: ~/.config/llm-prompt-manager/prompts
ollama_url: http://localhost:11434
lmstudio_url: http://localhost:1234
default_model: llama3.2
default_provider: ollama
```
## Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `LLM_PROMPT_DIR` | `~/.config/llm-prompt-manager/prompts` | Prompt storage directory |
| `OLLAMA_URL` | `http://localhost:11434` | Ollama API URL |
| `LMSTUDIO_URL` | `http://localhost:1234` | LM Studio API URL |
| `DEFAULT_MODEL` | `llama3.2` | Default LLM model |
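Resolution is the usual pattern: an environment variable, when set, overrides the built-in default. A minimal sketch using the variables and defaults from the table above (whether the config file sits between the two in precedence is an assumption):

```python
import os

def setting(env_var: str, default: str) -> str:
    """Return the environment variable's value if set, else the default."""
    return os.environ.get(env_var, default)

ollama_url = setting("OLLAMA_URL", "http://localhost:11434")
model = setting("DEFAULT_MODEL", "llama3.2")
print(ollama_url, model)
```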
## Examples

### Create a Code Explainer Prompt
```bash
llm-prompt prompt create explain-code \
  --template "Explain this code: {{ code }}" \
  --description "Explain code in simple terms" \
  --tag "docs" \
  --variable "code:The code to explain:true"
```
### Export to Ollama Format

```bash
llm-prompt export my-prompts.yaml --format ollama
```
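The exact shape of the Ollama export is not specified here; one plausible rendering is a Modelfile-style block (`FROM` and `SYSTEM` are real Modelfile directives), sketched below purely as an illustration:

```python
def to_ollama_modelfile(prompt: dict) -> str:
    """Render a prompt dict as a Modelfile-style string.

    Hypothetical output shape; the CLI's actual `--format ollama`
    serialization may differ.
    """
    model = prompt.get("model", "llama3.2")
    return f"FROM {model}\nSYSTEM \"\"\"{prompt['template']}\"\"\"\n"

print(to_ollama_modelfile({
    "name": "explain-code",
    "model": "llama3.2",
    "template": "Explain the following code: {{ code }}",
}))
```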
### Import from Another Library

```bash
llm-prompt import /path/to/prompts.json --force
```
## Troubleshooting

| Error | Solution |
|-------|----------|
| Prompt not found | Use `llm-prompt prompt list` to see available prompts |
| LLM connection failed | Ensure Ollama/LM Studio is running |
| Invalid YAML format | Check prompt YAML has required fields |
| Template variable missing | Provide all required `--var` values |
| Git repository not found | Run from a directory with a `.git` folder |
## License

MIT