CodeMate Ollama

Jaffar Hussain | 81 installs | Free
Local AI coding assistant powered by Ollama. Read, analyze, and refactor code with full workspace understanding. No cloud required.

CodeMate Ollama

Your local AI coding assistant - powered by Ollama. No cloud, no API keys, complete privacy.

CodeMate is a VS Code extension that transforms Ollama into a powerful coding assistant with direct workspace integration. Read files, understand code structure, refactor, create new files, and execute operations—all with user approval for safety.

✨ Features

🧠 AI-Powered Code Understanding

  • Workspace Analysis: Automatically understand your project structure
  • Code Reading: Read and analyze any file in your workspace
  • Smart Context: Remembers recent files and provides contextual assistance
  • Real-Time Streaming: See responses appear instantly as they're generated

🔧 Code Operations (With Approval)

  • Read Files: Understand code without modifications
  • Create Files: Generate new files with full code
  • Modify Files: Apply targeted edits with diff preview
  • Manage Files: Copy, rename, delete, and organize your workspace
  • Diff Preview: See exactly what changes before approving

🛡️ Safety & Control

  • Approval Workflow: All write/delete operations require your explicit approval
  • Diff Preview: Visual comparison of old vs new content
  • Protected Files: .git/, node_modules/, .env, and secrets are protected
  • Local Processing: All computation happens locally—no data sent to cloud

🚀 Performance

  • Fast: the LLM runs on your machine, so responses begin with no network latency
  • Streaming: Real-time response display
  • Lightweight: ~111KB VS Code extension
  • Responsive UI: Smooth animations and clear feedback

🎯 Quick Start

Prerequisites

  • Ollama installed and running (ollama serve)
  • Model available: ollama pull llama3.1:8b
  • VS Code 1.108+
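The prerequisites above can be verified from a terminal. This is a minimal sketch assuming Ollama's default port (11434) and its `/api/tags` endpoint, which lists locally available models:

```shell
# Check that the Ollama server is reachable on its default port.
# /api/tags returns the models available locally as JSON.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable; start it with: ollama serve"
fi
```

If the server is running but the model list is empty, pull a model first (e.g. `ollama pull llama3.1:8b`).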

Installation

  1. Search "CodeMate Ollama" in VS Code Extensions
  2. Click Install
  3. Click the robot icon 🤖 in activity bar
  4. Select model and start chatting!

📋 Supported Models

✅ Full Features (With Tools)

  • llama3.1:8b - Recommended (best balance)
  • qwen3-coder - Excellent for coding
  • mistral - Fast and capable

⚠️ Chat Only (No File Operations)

  • deepseek-coder - Great code generation
  • phi - Lightweight
  • Any model without tool support

CodeMate gracefully degrades to chat-only if needed!
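To check tool support yourself before picking a model, recent Ollama releases report a model's capabilities via `ollama show`. A sketch, assuming the `ollama` CLI is installed and the output format includes a capabilities section:

```shell
# Check whether a model supports tool calling.
# Recent Ollama releases list "tools" among the model's capabilities.
if command -v ollama >/dev/null 2>&1; then
  ollama show llama3.1:8b | grep -i -A3 capabilities || true
else
  echo "ollama CLI not found; install it from https://ollama.com"
fi
```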

🔐 Privacy First

  • ✅ 100% local processing
  • ✅ No API calls or cloud sync
  • ✅ No data leaves your machine
  • ✅ Protected file detection
  • ✅ Approval required for all changes

🚀 Usage Examples

"Refactor this function for readability"
→ CodeMate reads, proposes changes, shows diff, waits for approval

"Create a React component with TypeScript"
→ Generates code, previews, executes after approval

"Explain what this module does"
→ Analyzes file, provides detailed explanation

"Find all TODO comments"
→ Searches workspace, lists with context

⚙️ Configuration

Settings Panel

  • Ollama URL: Custom server (default: http://localhost:11434)
  • Model Selection: Choose from available models

Enable Context

Check "Include Active File" to:

  • Include current editor content
  • Get specific suggestions
  • Refactor with full context

🛠️ Features In Detail

File Operations (Safe & Approved)

  • ✅ Read files automatically
  • ✅ Create new files with preview
  • ✅ Modify files with diff
  • ✅ Delete/rename/copy with confirmation
  • ✅ Protected files stay safe

Real-Time Feedback

  • [↻ Thinking...] Loading indicator
  • Streaming responses
  • Stop button during generation
  • ✅ Tool execution status

Model Compatibility

  • Auto-detects model capabilities
  • Falls back gracefully
  • Shows clear notifications
  • Works with any Ollama model

🐛 Troubleshooting

"Bad Request" Error

  • The selected model doesn't support tool calling
  • The extension automatically falls back to chat-only mode
  • Chat still works; only file operations are unavailable

Models not showing

  • Ensure Ollama is running: ollama serve
  • Check URL in settings
  • Try refreshing
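The first two checks above can be run directly from a terminal. This sketch assumes the default Ollama URL; adjust the host if you configured a custom server in the settings panel:

```shell
# Troubleshoot "models not showing": query the server's model list directly.
# A JSON response means the server is up; otherwise start it with ollama serve.
curl -fsS http://localhost:11434/api/tags 2>/dev/null \
  || echo "Server unreachable; start it with: ollama serve"
```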

File operations fail

  • Verify workspace write permissions
  • Check file isn't protected
  • Try simpler operation

📊 What's Inside

  • 10 workspace tools
  • Streaming responses
  • Diff preview system
  • Model detection
  • Safety approvals
  • Context awareness
  • Beautiful UI

📝 License

MIT - Open source and free

🎉 Get Started Now

  1. Install from VS Code marketplace
  2. Start Ollama: ollama serve
  3. Click the robot icon 🤖
  4. Choose your model
  5. Start coding with AI! 🚀

All local. All private. All powerful.

Your code, your AI, your way. Welcome to CodeMate Ollama! 🤖✨
