# 🦙 Ollama Code Fixer - AI Coding Assistant

A comprehensive AI-powered coding assistant for VS Code, built on local Ollama models.

## 🌟 Features

### 🔧 Code Operations

- Fix Code - Automatic error and bug fixing
- Optimize Code - Performance and readability optimization
- Explain Code - Detailed explanations of code logic
- Add Comments - Generate comments and documentation
- Generate Tests - Create unit tests with edge case coverage
- Refactor Code - Improve structure and architecture
- Security Check - Analyze security vulnerabilities
- Generate Code - Create code from descriptions
- Translate Code - Convert between programming languages

### 🚀 Ollama Management

- Auto-start Server - Automatic Ollama startup when needed
- Model Selection - Quick switching between installed models
- Model Installation - Download and install new AI models
- Status Monitoring - Real-time API status tracking

### 🎯 Smart Features

- Auto Code Insertion - Configurable change application
- Preview Mode - Preview changes before applying
- Flexible Positioning - Replace, insert above/below, or new file
- Backup Creation - Automatic original code backup
- Intelligent Chat - Full-featured AI coding assistant

## 🛠 Installation

### Prerequisites

- [Ollama](https://ollama.com) - download and install
- VS Code version 1.85.0 or higher

### Extension Installation

- Open VS Code
- Go to Extensions (Ctrl+Shift+X)
- Search for "Ollama Code Fixer"
- Click Install

### Ollama Setup

```bash
# Install a model (e.g. CodeLlama)
ollama pull codellama:7b

# Start the server
ollama serve
```

## 🎮 Usage

### Context Menu

- Select code in the editor
- Right-click → choose an operation:
  - 🔧 Fix Code
  - ⚡ Optimize Code
  - 📝 Explain Code
  - 💬 Add Comments
  - 🧪 Generate Tests
  - 🔄 Refactor Code
  - 🔒 Security Check

### Command Palette

- Press `Ctrl+Shift+P`
- Type "Ollama" to view all commands

### Sidebar

- Click the Ollama icon in the Activity Bar
- Use the tool panel:
  - 🚀 Start Ollama - start the server
  - 🎯 Select Model - choose a model
  - 📥 Install Model - install a new model
  - 💬 Open AI Chat - open the chat

### Code Generation

- Place the cursor where the code should go
- `Ctrl+Shift+P` → "Generate Code"
- Describe what you want to create
- The AI generates code from your description
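
For example, a description like "a function that reverses a string" might produce something along these lines (a sketch of typical model output; the exact code depends on the model and prompt):

```python
def reverse_string(text: str) -> str:
    """Return text with its characters in reverse order."""
    # Slicing with a step of -1 walks the string backwards
    return text[::-1]

print(reverse_string("Ollama"))  # amallO
```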

## ⚙️ Configuration

### Basic Settings

```json
{
  "ollamaCodeFixer.modelName": "codellama:7b",
  "ollamaCodeFixer.ollamaApiUrl": "http://localhost:11434",
  "ollamaCodeFixer.language": "en",
  "ollamaCodeFixer.autoApplyChanges": false,
  "ollamaCodeFixer.showPreviewBeforeApply": true
}
```

### Behavior Settings

- `autoApplyChanges` - apply changes automatically, without confirmation
- `insertPosition` - where to insert generated code: `replace`, `above`, `below`, or `newFile`
- `backupOriginalCode` - create backup copies before applying changes
- `autoStartOllama` - start the Ollama server automatically when it is not running

### Model Settings

- `temperature` (0.0-2.0) - response creativity
- `topP` (0.0-1.0) - nucleus sampling
- `topK` (1-100) - top-k sampling
- `maxTokens` - maximum response length
- `contextLength` - context window size
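
These parameters could be tuned in `settings.json`; the exact keys below are an assumption, following the same `ollamaCodeFixer.*` naming pattern as the basic settings above:

```json
{
  "ollamaCodeFixer.temperature": 0.2,
  "ollamaCodeFixer.topP": 0.9,
  "ollamaCodeFixer.topK": 40,
  "ollamaCodeFixer.maxTokens": 2048,
  "ollamaCodeFixer.contextLength": 4096
}
```

Lower temperatures tend to give more deterministic code edits; higher values are better suited to open-ended generation.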

## 🌐 Multilingual Support

Supported interface languages:

- 🇺🇸 English (default)
- 🇷🇺 Russian
- 🇺🇦 Ukrainian
- 🇪🇸 Spanish

```jsonc
{
  "ollamaCodeFixer.language": "en" // en, ru, uk, es
}
```

To change the language: `Ctrl+Shift+P` → "🌐 Change Language"

## 🔧 Supported Programming Languages

- Web: JavaScript, TypeScript, HTML, CSS, SCSS
- Backend: Python, Java, C#, Go, Rust, PHP, Ruby
- Mobile: Swift, Kotlin, Dart
- Systems: C, C++, Scala
- Data: SQL, R, Julia
- DevOps: Shell, PowerShell, Docker, YAML
- And many more...

## 🎯 Recommended Models

### For Programming

- `codellama:7b` - optimal balance of speed and quality
- `codellama:13b` - higher quality
- `codegemma:7b` - fast code processing

### General Purpose

- `llama2:7b` - good overall quality
- `mistral:7b` - fast and efficient
- `gemma:7b` - balanced performance

## 📊 Usage Examples

### Error Fixing

```python
# Select problematic code
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)  # Inefficient

# Right-click → Fix Code
# Get an optimized version with memoization
```
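
For illustration, the kind of memoized version the Fix Code operation might produce (a sketch; the actual output depends on the model):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fibonacci(n):
    """Return the n-th Fibonacci number in O(n) time via memoization."""
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(30))  # 832040
```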

### Test Generation

```javascript
// Select function
function validateEmail(email) {
  const re = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
  return re.test(email);
}

// Right-click → Generate Tests
// Get a comprehensive test suite
```
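
For illustration, here is a Python port of the function above together with the kind of edge-case tests the Generate Tests operation might produce (a sketch, not the extension's exact output):

```python
import re

def validate_email(email):
    """Python port of the validateEmail function above."""
    return bool(re.match(r"^[^\s@]+@[^\s@]+\.[^\s@]+$", email))

# The sort of edge cases a generated suite typically covers:
def test_valid_address():
    assert validate_email("user@example.com")

def test_missing_at_sign():
    assert not validate_email("user.example.com")

def test_missing_domain_dot():
    assert not validate_email("user@example")

def test_whitespace_rejected():
    assert not validate_email("us er@example.com")

def test_empty_string():
    assert not validate_email("")
```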

### Code Explanation

```sql
-- Select a complex query
SELECT u.name,
       COUNT(o.id) AS order_count,
       AVG(o.total) AS avg_order_value
FROM users u
LEFT JOIN orders o ON u.id = o.user_id
WHERE u.created_at > '2024-01-01'
GROUP BY u.id, u.name
HAVING COUNT(o.id) > 5;

-- Right-click → Explain Code
-- Get a detailed explanation of the logic
```

## 🔄 Workflow

- Select code or place the cursor
- Choose an operation via the context menu or Command Palette
- Preview the result (if preview is enabled)
- Apply the changes or save them to a new file
- Use the chat for additional questions

## 🛡️ Security

- Local Processing - Your code never leaves your machine
- No Internet Requests - Everything works through local Ollama
- Backup Creation - Automatic original code backup
- Preview Mode - Control before applying changes

## 🎨 User Interface

### Status Bar

- `$(check) Ollama: Active` - server running
- `$(error) Ollama: Offline` - server unavailable
- Click to check status

### Sidebar Panel

Complete toolset with icons:
- 🔧 Code fixing
- ⚡ Optimization
- 📝 Explanations
- 💬 Comments
- 🧪 Tests
- 🔄 Refactoring
- 🔒 Security
- ✨ Generation
- 💬 AI Chat

## 🔍 Debugging

### Logging

```json
{
  "ollamaCodeFixer.logLevel": "debug"
}
```

Log levels:

- `error` - errors only
- `warn` - warnings
- `info` - information (default)
- `debug` - detailed debugging

### Status Check

- Command Palette → "Ollama: Check API Status"
- Status Bar → click Ollama status
- Output Panel → "Ollama Code Fixer"
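
Under the hood, a status check is just a request to the local Ollama API. A minimal sketch of the idea, assuming the default endpoint (`/api/tags` is Ollama's model-listing route; the function name here is illustrative, not the extension's actual code):

```python
import json
import urllib.error
import urllib.request

def check_ollama_status(base_url="http://localhost:11434"):
    """Return the list of installed models, or None if the server is offline."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
            return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

models = check_ollama_status()
if models is None:
    print("$(error) Ollama: Offline")
else:
    print(f"$(check) Ollama: Active - {len(models)} model(s) installed")
```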

## 🤝 Contributing

- Fork the project
- Create a feature branch (`git checkout -b feature/AmazingFeature`)
- Make your changes and add tests
- Commit (`git commit -m 'Add some AmazingFeature'`) and push (`git push origin feature/AmazingFeature`)
- Open a Pull Request

## 📋 TODO

- [ ] CodeLens support
- [ ] Git integration
- [ ] Full project analysis
- [ ] Custom prompts
- [ ] Framework-specific plugins
- [ ] Cloud model support

## 📝 Changelog

### v0.3.0 - 2024-12-19

#### 🚀 New Features

- Full AI agent with 9 different code operations
- Multilingual support - 4 interface languages (EN, RU, UK, ES)
- Ollama management - auto-start, model selection and installation
- Automatic model installation with real-time progress bar
- Flexible insertion settings - replace, above, below or new file
- Context menu with full operation set for selected code
- Preview mode for changes before applying
- Backup creation of original code to clipboard
- Auto-start Ollama when server unavailable

#### 🎨 Interface Improvements

- New icons for all operations
- Enhanced sidebar with extended functionality
- Informative status bar with clickable status
- Welcome message for new users

#### ⚙️ Settings

- Extended model parameters (temperature, top_p, top_k)
- Behavior settings (auto-apply, preview, position)
- Model management (ollama path, preferred models)

## 📄 License

MIT License - see the LICENSE file for details.
Made with ❤️ for developers using local AI models
