# Local Code Assistant

A privacy-focused CLI tool for local AI code assistance using Ollama. Generate code, explain functionality, refactor for better structure, and write tests - all without sending your code to external APIs.

## Features

- **Code Generation**: Generate clean, well-documented code from natural language prompts
- **Code Explanation**: Get clear explanations of what code does and how it works
- **Code Refactoring**: Improve code structure with safe or aggressive refactoring options
- **Performance Optimization**: Optimize code for better performance and efficiency
- **Test Generation**: Automatically generate comprehensive unit tests
- **Interactive REPL**: Enter an interactive session for continuous code assistance
- **Multi-Language Support**: Python, JavaScript, TypeScript, Go, and Rust
- **Context-Aware**: Include project files for better suggestions
- **Secure Offline Operation**: All processing stays local - your code never leaves your machine
## Installation

### Prerequisites

- Python 3.9 or higher
- [Ollama](https://ollama.com) installed and running
- A local LLM (e.g., codellama, deepseek-coder)

### Install via pip

```bash
pip install local-code-assistant
```

### Install from Source

```bash
git clone https://github.com/local-code-assistant/local-code-assistant.git
cd local-code-assistant
pip install -e .
```

### Install Ollama Models

```bash
# Install one or more code-focused models
ollama pull codellama
ollama pull deepseek-coder
ollama pull starcoder2

# List available models
ollama list
```
## Quick Start

### Check Connection

```bash
local-code-assistant status
```

### Generate Code

```bash
local-code-assistant generate "a function to calculate fibonacci numbers" --language python
```

### Explain Code

```bash
local-code-assistant explain script.py
local-code-assistant explain app.ts --markdown
```

### Refactor Code

```bash
local-code-assistant refactor my_module.py --safe
local-code-assistant refactor app.py -f readability -f naming -o refactored.py
```

### Optimize Code

```bash
local-code-assistant optimize slow.py -o fast.py
```

### Generate Tests

```bash
local-code-assistant test my_module.py
local-code-assistant test app.py -o test_app.py
```

### Interactive REPL

```bash
local-code-assistant repl
```
## Configuration

### Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `OLLAMA_BASE_URL` | http://localhost:11434 | Ollama API endpoint URL |
| `OLLAMA_MODEL` | codellama | Default model to use |
| `OLLAMA_TIMEOUT` | 8000 | Request timeout in seconds |
| `CONFIG_PATH` | ~/.config/local-code-assistant/config.yaml | Path to config file |
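
These variables can be exported for a shell session or set inline for a single invocation. For example (the values below are purely illustrative):

```bash
# Point the assistant at a different Ollama endpoint and model for this session
export OLLAMA_BASE_URL=http://192.168.1.50:11434   # example address of a remote Ollama host
export OLLAMA_MODEL=deepseek-coder

# Or override just one command
OLLAMA_MODEL=starcoder2 local-code-assistant status
```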
### Configuration File

Create `~/.config/local-code-assistant/config.yaml`:

```yaml
ollama:
  base_url: http://localhost:11434
  model: codellama
  timeout: 8000
  streaming: true

defaults:
  language: python
  temperature: 0.2
  max_tokens: 4000

context:
  max_files: 10
  max_file_size: 100000

output:
  syntax_highlighting: true
  clipboard: true
```
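
If you keep a project-specific configuration, the `CONFIG_PATH` variable from the table above can point the assistant at it. A small sketch (the file name and prompt are just examples):

```bash
# Use a per-project config file instead of the default location
CONFIG_PATH=./.local-code-assistant.yaml local-code-assistant generate "a CLI argument parser"
```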
## Commands

### generate

Generate code from natural language prompts.

```bash
local-code-assistant generate "a REST API endpoint for user authentication" \
  --language python \
  --output auth.py \
  --temperature 0.3
```

Options:
- `--language, -l`: Programming language (default: python)
- `--output, -o`: Write generated code to file
- `--clipboard/--no-clipboard`: Copy to clipboard
- `--model, -m`: Model to use
- `--temperature, -t`: Temperature (0.0-1.0)

### explain

Explain code from a file.

```bash
local-code-assistant explain complex_module.py --markdown
```

Options:
- `--language, -l`: Programming language (auto-detected)
- `--markdown/--no-markdown`: Format output as markdown
### refactor

Refactor code for better structure.

```bash
local-code-assistant refactor legacy_code.py \
  --safe \
  --focus readability \
  --focus naming
```

Options:
- `--focus, -f`: Focus areas (readability, structure, naming, documentation)
- `--safe/--unsafe`: Safe refactoring maintains behavior
- `--output, -o`: Write to file
- `--clipboard/--no-clipboard`: Copy to clipboard

### optimize

Optimize code for performance.

```bash
local-code-assistant optimize slow_algorithm.py -o optimized.py
```

### test

Generate unit tests for code.

```bash
local-code-assistant test my_module.py -o test_my_module.py
```
### repl

Enter interactive REPL mode.

```bash
local-code-assistant repl --model codellama --language python
```

REPL Commands:
- `:generate <prompt>` - Generate code
- `:explain` - Explain last generated code
- `:lang <language>` - Set programming language
- `:model <name>` - Set model
- `:status` - Show current settings
- `:clear` - Clear conversation
- `:quit` or Ctrl+D - Exit

### status

Check connection and model status.

```bash
local-code-assistant status
```

### models

List available Ollama models.

```bash
local-code-assistant models
```

### version

Show version information.

```bash
local-code-assistant version
```
## Supported Languages

| Language | Extensions | Testing Framework |
|----------|------------|-------------------|
| Python | .py, .pyw, .pyi | pytest |
| JavaScript | .js, .mjs, .cjs | jest |
| TypeScript | .ts, .tsx | jest |
| Go | .go | testing |
| Rust | .rs | test |
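
Any of these languages can be selected with the `--language` flag on `generate`; a couple of illustrative invocations (the prompts are only examples):

```bash
# Generate in languages other than the Python default
local-code-assistant generate "a debounce helper" --language typescript
local-code-assistant generate "an HTTP handler that returns JSON" --language go
```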
## Recommended Models

- **codellama**: General purpose code generation
- **deepseek-coder**: High-quality code completion
- **starcoder2**: Multi-language support
- **qwen2.5-coder**: Balanced performance
- **phi4**: Efficient code understanding
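
To try one of these, pull it with Ollama and point the assistant at it with `--model` (the prompt is only an example):

```bash
ollama pull deepseek-coder
local-code-assistant generate "a binary search function" --model deepseek-coder
```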
## Project Context

When generating or refactoring code, you can include project files for better context:

```bash
local-code-assistant generate "add error handling" --context --max-files 5
```
## Development

### Setup Development Environment

```bash
git clone https://github.com/local-code-assistant/local-code-assistant.git
cd local-code-assistant
pip install -e ".[dev]"
```

### Running Tests

```bash
# Run all tests
pytest tests/ -v

# Run with coverage
pytest tests/ --cov=local_code_assistant --cov-report=term-missing

# Run specific test file
pytest tests/test_cli.py -v
```

### Code Quality

```bash
# Format code
black local_code_assistant/

# Lint code
ruff check local_code_assistant/

# Type checking
mypy local_code_assistant/
```
## Architecture

```
local_code_assistant/
├── cli.py              # Main CLI entry point
├── commands/
│   ├── base.py         # Base command class
│   ├── generate.py     # Code generation command
│   ├── explain.py      # Code explanation command
│   ├── refactor.py     # Code refactoring command
│   ├── test.py         # Test generation command
│   └── repl.py         # Interactive REPL
├── services/
│   ├── ollama.py       # Ollama API client
│   └── config.py       # Configuration management
├── prompts/
│   └── templates.py    # Prompt templates and language config
├── utils/
│   ├── context.py      # Project context building
│   └── language.py     # Language detection utilities
└── tests/              # Test suite
```
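
Requests ultimately go through the Ollama client in `services/ollama.py`, which talks to the local Ollama HTTP API configured via `OLLAMA_BASE_URL`. As a rough sketch of what that boils down to on the wire (this is Ollama's standard endpoint, not the module's internal interface):

```bash
# Minimal illustration of the underlying Ollama generate request
curl http://localhost:11434/api/generate -d '{
  "model": "codellama",
  "prompt": "Write a Python function that reverses a string.",
  "stream": false
}'
```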
## Troubleshooting

### Cannot connect to Ollama

```bash
# Make sure Ollama is running
ollama serve

# Check if Ollama is accessible
curl http://localhost:11434/api/tags
```

### Model not found

```bash
# Pull the model
ollama pull codellama

# List installed models
ollama list
```

### Slow responses

- Reduce `max_tokens` in configuration
- Use a smaller model (see the example below)
- Increase `OLLAMA_TIMEOUT`
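
Switching to a smaller model variant often has the largest effect. A sketch (exact tags depend on what the Ollama registry offers for a given model):

```bash
# Pull a smaller variant and use it for a single command
ollama pull codellama:7b
local-code-assistant generate "a slugify helper" --model codellama:7b
```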
### Clipboard not working

- Install pyperclip dependencies:
  - Linux: `sudo apt-get install xclip` or `xsel`
  - macOS: Already supported
  - Windows: Already supported

## Contributing

Contributions are welcome! Please read our contributing guidelines before submitting PRs.

## License

MIT License - see LICENSE file for details.

## Security

This tool is designed with privacy in mind:
- All processing happens locally
- No external API calls (except to your local Ollama instance)
- No telemetry or data collection
- Your code never leaves your machine