fix: resolve CI test failures (config access, mocks, imports)
Some checks failed
CI / lint (push) Has been cancelled
CI / type-check (push) Has been cancelled
CI / test (push) Has been cancelled

2026-02-04 11:49:00 +00:00
parent 184ee18931
commit 8f4a69e2c0

README.md

@@ -1,44 +1,115 @@
# ShellGenius
AI-Powered Local Shell Script Assistant using Ollama.
## Overview
ShellGenius is a CLI tool that uses local LLMs (Ollama) to generate, explain, and refactor shell scripts interactively. Developers can describe what they want in natural language, and the tool generates safe, commented shell commands with explanations.
## Features
- **Natural Language to Shell Generation**: Convert natural language descriptions into shell commands
- **Interactive TUI Interface**: Rich terminal UI with navigation, command history, and suggestions
- **Script Explanation Mode**: Parse and explain existing shell scripts line-by-line
- **Safe Refactoring Suggestions**: Analyze scripts and suggest safer alternatives
- **Command History Learning**: Learn from your command history for personalized suggestions
- **Multi-Shell Support**: Support for bash, zsh, and sh scripts
## Installation
```bash
# Install from source
pip install .
# Install with dev dependencies
pip install -e ".[dev]"
```
## Requirements
- Python 3.10+
- Ollama running locally (https://ollama.ai)
- Recommended models: codellama, llama2, mistral
## Configuration
ShellGenius uses a `config.yaml` file for configuration. See `.env.example` for environment variables.
```yaml
ollama:
  host: "localhost:11434"
  model: "codellama"
  timeout: 120
safety:
  level: "moderate"
  warn_patterns:
    - "rm -rf"
    - "chmod 777"
    - "sudo su"
```
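Environment variables can also be set in the shell before launching the tool. The variable names below come from the Troubleshooting section; that they take precedence over `config.yaml` is an assumption, not documented behavior:

```shell
# Point ShellGenius at a specific Ollama endpoint and model.
# OLLAMA_HOST / OLLAMA_MODEL are the names referenced under Troubleshooting;
# precedence over config.yaml is assumed here, not confirmed.
export OLLAMA_HOST="localhost:11434"
export OLLAMA_MODEL="codellama"
echo "Ollama endpoint: $OLLAMA_HOST (model: $OLLAMA_MODEL)"
```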
## Usage
### Interactive Mode
```bash
shellgenius
```
### Generate Shell Commands
```bash
shellgenius generate "find all Python files modified in the last 24 hours"
```
### Explain a Script
```bash
shellgenius explain script.sh
```
### Refactor with Safety Checks
```bash
shellgenius refactor script.sh --suggestions
```
## Commands
| Command | Description |
|---------|-------------|
| `shellgenius` | Start interactive TUI |
| `shellgenius generate <description>` | Generate shell commands |
| `shellgenius explain <script>` | Explain a shell script |
| `shellgenius refactor <script>` | Analyze and refactor script |
| `shellgenius history` | Show command history |
| `shellgenius models` | List available Ollama models |
## Safety
ShellGenius includes safety features:
- Destructive command warnings
- Dry-run mode for testing
- Permission checks
- Safety level configuration
Use the `--force` flag to bypass warnings if you are confident the command is safe.
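The destructive-command check can be illustrated with a plain `grep` over the `warn_patterns` from the Configuration example. This is a minimal sketch of the idea, not ShellGenius's actual implementation; `check_script` is a hypothetical helper name:

```shell
# Sketch: flag lines matching the warn_patterns from the config example above.
check_script() {
  local script="$1"
  local patterns=("rm -rf" "chmod 777" "sudo su")
  for p in "${patterns[@]}"; do
    # -F: fixed string, -n: show line numbers of matches
    if grep -nF -- "$p" "$script"; then
      echo "WARNING: dangerous pattern found: $p" >&2
    fi
  done
}

# Demo: a script containing one flagged pattern
printf 'echo hello\nrm -rf /tmp/old\n' > /tmp/demo.sh
check_script /tmp/demo.sh
```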
## Troubleshooting
### Ollama connection failed
- Run `ollama serve` to start Ollama
- Check `OLLAMA_HOST` environment variable
### Model not found
- Pull required model: `ollama pull <model_name>`
- Change `OLLAMA_MODEL` setting
### Timeout during generation
- Increase timeout in config.yaml
- Simplify the request
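For example, raising the `timeout` key shown in the Configuration section (units as in the original example, where the value is 120):

```yaml
# config.yaml — allow slower models more time per request
ollama:
  timeout: 300
```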
## License