# LocalAI Chat — VS Code Extension
Chat with local AI models (Ollama, LM Studio, or any OpenAI-compatible server) directly inside VS Code.
## Features
- 💬 Full chat interface in the sidebar, like ChatGPT but local
- 🔌 Auto-detects models from Ollama, LM Studio, or custom endpoints
- 📂 Editor context — select code and it's automatically included
- 🖱️ Right-click menu — "Explain Selection" and "Fix Selection"
- ⬇️ Insert code directly into the active editor from any response
- 🔄 Streaming responses with real-time token display
- ⚙️ Configurable provider, model, and system prompt
## Installation
### 1. Install the extension
**From VSIX file (recommended):**

Extensions panel → **⋯** → **Install from VSIX** → select `localai-chat-1.0.0.vsix`
**From source:**

```bash
npm install
npm run compile
npx @vscode/vsce package                          # produces localai-chat-1.0.0.vsix
code --install-extension localai-chat-1.0.0.vsix  # install it into VS Code
```
### 2. Start your local AI provider
**Ollama** (default, port 11434):

```bash
ollama serve
ollama pull llama3.2   # or any other model
```
**LM Studio:** start the local server from within the app (port 1234).

**Other OpenAI-compatible APIs:** set the custom URL in the extension settings.
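A custom provider only needs to implement the standard OpenAI chat-completions endpoint (`POST {baseUrl}/v1/chat/completions`). As a rough sketch of the request shape such a server must accept — the field names follow the public OpenAI API, and `buildRequest` is a hypothetical helper, not part of this extension:

```typescript
// Shape of an OpenAI-compatible chat request. These field names are
// the standard OpenAI API ones; the extension's internals may differ.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
  stream: boolean; // true → tokens arrive incrementally as server-sent events
}

// Hypothetical helper for illustration only.
function buildRequest(model: string, userText: string): ChatRequest {
  return {
    model,
    messages: [
      { role: "system", content: "You are a helpful coding assistant." },
      { role: "user", content: userText },
    ],
    stream: true,
  };
}

// A client would then POST it, e.g.:
// fetch("http://localhost:8080/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildRequest("llama3.2", "Explain this code")),
// });
```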
### 3. Configure the extension

Open Settings → search `localai`:

| Setting | Default | Description |
|---|---|---|
| `localai.provider` | `ollama` | Provider: `ollama`, `lmstudio`, or `custom` |
| `localai.ollamaUrl` | `http://localhost:11434` | Ollama endpoint |
| `localai.lmstudioUrl` | `http://localhost:1234` | LM Studio endpoint |
| `localai.customUrl` | `http://localhost:8080` | Custom endpoint |
| `localai.defaultModel` | (auto) | Pre-selected model |
| `localai.systemPrompt` | coding assistant | System prompt |
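The same keys can also be set per-workspace in `.vscode/settings.json`. A sketch, with illustrative values for a custom endpoint:

```jsonc
// .vscode/settings.json — example values, adjust to your setup
{
  "localai.provider": "custom",
  "localai.customUrl": "http://localhost:8080",
  "localai.defaultModel": "llama3.2",
  "localai.systemPrompt": "You are a concise coding assistant."
}
```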
## Usage
| Action | How |
|---|---|
| Open chat | Click the ⚡ icon in the activity bar, or press `Ctrl+Shift+L` |
| Send message | Type and press `Enter` |
| New line | `Shift+Enter` |
| Explain code | Select code → right-click → **LocalAI: Explain Selection** |
| Fix code | Select code → right-click → **LocalAI: Fix Selection** |
| Insert AI code | Click the insert button on any code block in the response |
| Copy code | Click the copy button on any code block |
| Switch model | Use the dropdown at the top of the panel |
| Clear chat | Click ✕ in the header |
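If `Ctrl+Shift+L` conflicts with another extension, it can be rebound in `keybindings.json`. The command ID below (`localai.openChat`) is a guess for illustration; check the actual ID under the extension's **Feature Contributions** tab in VS Code:

```jsonc
// keybindings.json — rebind the open-chat shortcut.
// "localai.openChat" is an assumed command ID; verify it first.
[
  {
    "key": "ctrl+alt+l",
    "command": "localai.openChat"
  }
]
```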
## Building from source

```bash
npm install
npm run compile
npm install -g @vscode/vsce
vsce package
```

This produces `localai-chat-1.0.0.vsix`, which can be installed in any VS Code installation.