🦙 Ollama Agent - Local AI Coding Assistant

Transform your locally running Ollama models into an intelligent coding agent with capabilities rivaling GitHub Copilot and Cursor, all 100% local and completely private. No cloud, no subscriptions, no data leaks.
🎉 New in v1.1: AI Completions, Quick Edit (Cmd+K), Code Review, Undo/Redo, Snippets Library, Prompt Templates, Theme Toggle, Favorites, and more!
✨ Key Features
🚀 Copilot-Like Features
- ✍️ AI Code Completions: Ghost text suggestions as you type (like GitHub Copilot)
- ⚡ Quick Edit (Cmd+K): Inline code editing with natural language instructions
- 🎯 Quick Actions Bar: One-click access to Explain, Fix, Optimize, Test, and more
- 📋 Prompt Templates Library: 10 pre-built templates for common tasks
- 💾 Code Snippets Library: Reusable code snippets for React, Node.js, Python, TypeScript
- ⏪ Undo/Redo System: Full history of AI edits with one-click undo (Ctrl+Alt+Z)
- 🔍 Code Review Mode: Comprehensive AI-powered code reviews
🎨 Enhanced UI/UX
- 🌓 Light/Dark Theme: Toggle between themes with sun/moon icon
- ⭐ Thread Favorites: Star important conversations and filter favorites
- 💬 Message Reactions: React to AI responses with 👍 👎 ❤️ 🎉
- ⌨️ Keyboard Shortcuts Panel: View all shortcuts in one place
- ✨ Smooth Animations: Professional fade-in, slide, and hover effects
- 📊 Performance Metrics: Real-time token count, response time, tokens/sec display
🔧 Advanced Context Gathering
- @terminal: Include terminal output in your prompts
- @diagnostics: Add compiler errors and warnings
- @workspace: Include project file structure
- @git-diff: Add uncommitted changes context
- @filename: Attach specific files to messages
- 🔍 History Search: Instantly find past conversations
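These context providers can be combined in a single prompt; for example, @workspace @git-diff summarize my uncommitted changes and suggest a commit message pulls in both the project file structure and the pending diff.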
🤖 Intelligent Agent Mode
- Autonomous Task Execution: Agent can read files, search codebases, create/edit files, and run tasks
- Multi-Step Reasoning: Up to 8-step problem-solving with context awareness
- Smart Tool Usage: Automatically explores your project structure and applies changes
- Read & Agent Modes: Choose between safe read-only or full agent capabilities
💬 Interactive Chat Interface
- Modern WebView UI: Clean, responsive chat interface with streaming responses
- Multi-Model Support: Chat with multiple Ollama models simultaneously
- Context-Aware: Automatically includes selected code, open files, and project structure
- Conversation History: Keep track of all interactions within sessions
📝 Code Generation & Editing
- Inline Code Generation: Generate code directly at cursor position
- Smart Refactoring: Simplify, optimize, or add types to selected code
- Function Extraction: Automatically extract functions with AI assistance
- Code Translation: Convert code between languages
- Multi-File Editing: Process multiple @mentioned files in one operation
- Direct Apply: Changes applied instantly with visual feedback
📚 Documentation & Analysis
- README Generator: Dedicated mode to generate comprehensive project documentation
- Docstring Completion: AI-generated documentation for functions and classes
- Code Explanation: Understand complex code with AI explanations
- PR Summaries: Generate pull request descriptions and review comments
- Code Review: Get detailed analysis of code quality, security, performance, and bugs
🧪 Testing & Quality
- Unit Test Generation: Create comprehensive test suites automatically
- Problem Analysis: AI-powered debugging and error resolution
- Code Review: Get suggestions for improvements and best practices
- Project Indexer: Fast semantic search across your entire codebase
- Commit Message Generation: Smart git commit messages from staged changes
- Shell Command Explanation: Understand complex terminal commands
- File Creation with AI: Generate boilerplate files with context
🎨 Customization
- Configurable Settings: Host, port, system prompts, and behavior modes
- Keyboard Shortcuts: Quick access to all commands
- Context Menu Integration: Right-click access in editor
- Status Bar Integration: Quick model switching and status updates
📦 Installation
Prerequisites
- Install Ollama: Download from ollama.ai
- Pull Models: Choose your preferred models:
  - ollama pull llama3.1:8b
  - ollama pull mistral
  - ollama pull codellama
  - ollama pull deepseek-coder
- Start Ollama: Run ollama serve and ensure it is listening on localhost:11434 (a quick verification check follows below)
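Before opening the chat panel, it can help to confirm that the server and at least one model are reachable; a quick check, assuming the default host and port:
# List locally installed models
ollama list
# Confirm the Ollama HTTP API responds
curl http://localhost:11434/api/tags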
Install Extension
From VS Code Marketplace (Coming Soon):
- Open VS Code
- Go to Extensions (Ctrl+Shift+X / Cmd+Shift+X)
- Search for "Ollama Agent"
- Click Install
From VSIX (Manual):
- Download the .vsix file from GitHub Releases
- Open VS Code
- Run command: Extensions: Install from VSIX...
- Select the downloaded file
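Alternatively, if the code command-line tool is on your PATH, the VSIX can be installed from a terminal. A minimal sketch; the filename below is illustrative, so substitute the file you actually downloaded:
# Install the downloaded VSIX via the VS Code CLI (filename is illustrative)
code --install-extension ollama-agent-1.1.0.vsix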
🚀 Quick Start
1. Open Chat Panel
- Click the Ollama icon in the Activity Bar (left sidebar)
- Or run the command: Ollama Agent: Open Chat (Ctrl+Alt+K)
- The chat panel will open with the Quick Actions bar
2. Try Quick Actions (NEW!)
- Click ⚡ Quick Actions to expand the action bar
- Choose from 8 one-click actions: Explain, Fix, Optimize, Tests, Document, Refactor, Debug, Review
- Actions automatically populate the input with slash commands
3. Use AI Completions (NEW!)
- Enable in Settings: ollamaAgent.enableInlineCompletions
- Start typing code and see ghost text suggestions appear
- Press Tab to accept, Esc to dismiss
- Toggle on/off with status bar icon
4. Quick Edit with Cmd+K (NEW!)
- Select code and press Ctrl+K (or Cmd+K on Mac)
- Type an instruction: "add error handling" or "optimize this"
- Changes apply instantly with visual feedback
- Use Ctrl+Alt+Z to undo any AI edit
5. Select Agent Mode
- Read Mode (Safe): AI can only read and suggest changes
- Agent Mode (Powerful): AI can automatically create/edit files
6. Start Chatting
- Type your question or request
- Use @filename to attach files, e.g. @src/app.ts explain this
- Use @terminal, @diagnostics, @workspace, @git-diff for context
- Select one or more models to query
- Get streaming responses with real-time metrics
7. Explore New Features
- 📋 Templates Button: Access 10 prompt templates for common tasks
- 💾 Snippets Button: Copy ready-to-use code snippets
- ⌨️ Keyboard Shortcuts: View all shortcuts in modal
- 🌓 Theme Toggle: Switch between light/dark themes
- ⭐ Favorites: Star important conversations
- 🔍 Search: Find past conversations instantly
8. Use Context Menu Commands
- Select code → Right-click → Ollama Agent options
- Ask about selection, generate code, refactor, explain, and more
⚙️ Configuration
Settings
Access via File > Preferences > Settings, then search for "Ollama Agent"
| Setting | Default | Description |
|---------|---------|-------------|
| ollamaAgent.host | localhost | Ollama server host |
| ollamaAgent.port | 11434 | Ollama server port |
| ollamaAgent.systemPrompt | You are a helpful coding assistant. | Default system prompt |
| ollamaAgent.provider | ollama | AI provider (ollama/openai/anthropic) |
| ollamaAgent.mode | read | Interaction mode (read/agent) |
| ollamaAgent.enableInlineCompletions | false | NEW: Enable ghost text code completions |
| ollamaAgent.enableExplainOnHover | false | Experimental: hover explanations |
| ollamaAgent.enableDocstringCompletion | false | Experimental: docstring suggestions |
| ollamaAgent.applyPreviewAlways | false | NEW: Show diff preview before applying |
| ollamaAgent.applyConfirmInline | true | NEW: Apply changes with Accept/Decline UI |
Example Configuration
{
"ollamaAgent.host": "localhost",
"ollamaAgent.port": 11434,
"ollamaAgent.mode": "agent",
"ollamaAgent.systemPrompt": "You are an expert TypeScript developer focused on clean code and best practices."
}
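If Ollama runs on a different machine, point ollamaAgent.host and ollamaAgent.port at that server. The remote Ollama instance must also listen on an externally reachable interface; a hedged sketch using Ollama's OLLAMA_HOST environment variable (the address is an example, adjust it for your network):
# On the machine running Ollama, listen on all interfaces (example address)
OLLAMA_HOST=0.0.0.0:11434 ollama serve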
📖 Usage Examples
Generate a Function
- Place cursor where you want the function
- Run: Ollama Agent: Generate Code Here
- Describe what you want: "Create a function to validate email addresses"
- Review and insert the generated code
Refactor Code
- Select the code to refactor
- Right-click → Ollama Agent: Brush – Simplify
- AI will suggest cleaner, more readable code
- Review changes in diff view and apply
Create Unit Tests
- Select a function or class
- Run: Ollama Agent: Generate Unit Tests
- AI generates a comprehensive test suite
- Tests appear in a new file or on the clipboard
Generate README
- Run: Ollama Agent: README Generator
- Choose style (Technical/Product/Library)
- Choose placement (GitHub/Marketplace/Website)
- Add custom notes
- Enable deep scan for comprehensive analysis
- AI generates complete README with all sections
Debug with AI
- Click on a problem in Problems panel
- Run: Ollama Agent: Fix This Problem
- AI analyzes the error and suggests fixes
- Apply fix with one click
🎯 Command Reference
New Commands (v1.1)
- Ollama Agent: Quick Edit (Ctrl+K / Cmd+K) - Inline code editing
- Ollama Agent: Code Review - Comprehensive AI code review
- Ollama Agent: Undo AI Edit (Ctrl+Alt+Z) - Undo last AI change
- Ollama Agent: Redo AI Edit (Ctrl+Alt+Shift+Z) - Redo AI change
- Ollama Agent: Show Edit History - View all AI edits
- Ollama Agent: Clear Edit History - Reset edit history
- Ollama Agent: Toggle Completions - Enable/disable ghost text
Chat & Interaction
- Ollama Agent: Open Chat - Open main chat panel
- Ollama Agent: README Generator - Generate project documentation
- Ollama Agent: Ask About Selection - Query AI about selected code
Code Generation
- Ollama Agent: Generate Code Here - Insert AI-generated code
- Ollama Agent: Generate Unit Tests - Create test suite
- Ollama Agent: Generate Commit Message - Smart git messages
- Ollama Agent: Generate README - Auto-generate documentation
Code Editing
- Ollama Agent: Edit Selection by Instruction - Guided refactoring
- Ollama Agent: Brush – Simplify - Simplify code
- Ollama Agent: Brush – Add Types - Add TypeScript types
- Ollama Agent: Brush – Optimize - Performance optimization
- Ollama Agent: Extract Function - Extract selected code
Analysis & Explanation
- Ollama Agent: Explain Selection - Understand code
- Ollama Agent: Analyze Problems - Debug assistance
- Ollama Agent: Explain Command - Shell command help
- Ollama Agent: Show Index Stats - View codebase stats
Project Management
- Ollama Agent: PR Summary - Generate PR descriptions
- Ollama Agent: PR Review Comments - AI code review
- Ollama Agent: Rebuild Project Index - Refresh semantic search
- Ollama Agent: Add File (AI) - Generate boilerplate files
Translation
- Ollama Agent: Translate Selection - Convert between languages
⌨️ Keyboard Shortcuts
| Action | Windows/Linux | Mac |
|--------|---------------|-----|
| Open Chat | Ctrl+Alt+K | Cmd+Alt+K |
| Quick Edit | Ctrl+K | Cmd+K |
| Explain Selection | Ctrl+Alt+E | Cmd+Alt+E |
| Generate Tests | Ctrl+Alt+T | Cmd+Alt+T |
| Undo AI Edit | Ctrl+Alt+Z | Cmd+Alt+Z |
| Redo AI Edit | Ctrl+Alt+Shift+Z | Cmd+Alt+Shift+Z |
| Send Message | Enter | Enter |
| New Line | Shift+Enter | Shift+Enter |
🔧 Troubleshooting
Ollama Not Connecting
Problem: "Failed to connect to Ollama"
Solutions:
- Verify Ollama is running: ollama serve
- Check settings: host=localhost, port=11434
- Test connection: curl http://localhost:11434/api/tags
- Restart VS Code
No Models Available
Problem: "No models found"
Solutions:
- Pull a model: ollama pull llama3.1
- List models: ollama list
- Refresh the extension: run Developer: Reload Window
Slow Response Times
Problem: AI responses are very slow
Solutions:
- Use smaller models (e.g., llama3.1:8b instead of 70b)
- Reduce context: Disable deep scan in README generator
- Close other resource-intensive applications
- Consider using GPU acceleration with Ollama
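To confirm whether a loaded model is actually running on the GPU, recent Ollama CLI versions can report the active processor per model; a quick check, assuming the ollama ps subcommand is available in your installation:
# Show loaded models and whether they run on CPU or GPU
ollama ps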
Agent Mode Not Working
Problem: Agent mode doesn't apply changes
Solutions:
- Check mode setting: ollamaAgent.mode = agent
- Verify file permissions (read/write access)
- Check workspace trust: File → Trust Workspace
- Review error messages in Output panel
Extension Not Loading
Problem: Extension doesn't activate
Solutions:
- Check VS Code version (minimum 1.105.0)
- View logs: Output → Ollama Agent
- Disable conflicting extensions
- Reinstall extension
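To confirm which VS Code build is installed, the version can be printed from a terminal, assuming the code CLI is on your PATH:
# Print the VS Code version, commit, and architecture
code --version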
🛠️ Development
Setup
# Clone repository
git clone https://github.com/IamNishant51/Ollama-Agent-vs-code-extension.git
cd Ollama-Agent-vs-code-extension
# Install dependencies
npm install
# Build webview
npm run build:webview
# Compile TypeScript
npm run compile
# Watch mode (auto-recompile)
npm run watch
Publishing
See PUBLISHING.md for a step-by-step guide to package and publish this extension to the VS Code Marketplace.
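For a local build without publishing to the Marketplace, the extension can typically be packaged into a .vsix with the standard vsce tooling; a minimal sketch, assuming @vscode/vsce and deferring to PUBLISHING.md as the authoritative guide:
# Package the extension into a .vsix using the VS Code extension manager CLI
npx @vscode/vsce package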
Testing
# Run linter
npm run lint
# Run tests
npm test
Debug
- Open project in VS Code
- Press F5 to launch Extension Development Host
- Test features in new window
- View logs in Debug Console
🤝 Contributing
Contributions are welcome! Please read our Contributing Guide for details.
Areas for Contribution
- Add support for more AI providers
- Improve agent reasoning capabilities
- Add more code generation templates
- Enhance UI/UX with additional themes
- Write comprehensive tests
- Improve documentation
- Report bugs and suggest features
📝 License
This project is licensed under the MIT License - see the LICENSE file for details.
🙏 Acknowledgments
- Ollama for providing the local LLM runtime
- VS Code for the excellent extension API
- The open-source community for inspiration and support
📞 Support
🗺️ Roadmap
Completed in v1.1 ✅
- [x] AI Code Completions (ghost text)
- [x] Quick Edit with Cmd+K
- [x] Multi-file editing support
- [x] Code Review mode
- [x] Undo/Redo system for AI edits
- [x] Prompt Templates library
- [x] Code Snippets library
- [x] Theme toggle (light/dark)
- [x] Thread favorites
- [x] Message reactions
- [x] Performance metrics display
- [x] Advanced context gathering (@terminal, @diagnostics, etc.)
- [x] History search
Coming Soon
- [ ] Voice input support
- [ ] Model comparison view
- [ ] Custom user templates
- [ ] Extension marketplace for agent tools
- [ ] Collaborative coding sessions
- [ ] Integration with popular dev tools
- [ ] Mobile companion app
- [ ] Advanced diff view with multiple algorithms
⭐ Star History
If you find this extension useful, please consider giving it a star on GitHub!
Made with ❤️ by Nishant Unavane
Powered by Ollama • Privacy-First • Open Source