LocalChat is a VS Code extension that brings the power of AI code assistance to your development environment while keeping everything 100% local and private. Think of it as your personal, offline coding companion.
✨ Key Features
🔒 100% Local Operation: All AI processing happens on your machine
🤖 AI Code Assistance: Get code suggestions, explanations, and help
🔐 Privacy Focused: No data leaves your system
✅ GDPR Compliant: Your code stays with you
🌐 Flexible Setup: Use Ollama locally or on your private server
📋 Requirements
VS Code 1.97.0 or higher
Ollama installed locally or running on a remote server
Sufficient system resources to run your chosen AI model
On macOS, or when running Ollama on a remote server, you may need to set the environment variable OLLAMA_HOST=0.0.0.0:11434 so that Ollama accepts connections from VS Code (see the example below)
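For example, here is a minimal setup sketch, assuming Ollama's default port 11434 and the default model; adjust the host and model tag to your environment:

```sh
# Pull the default model used by LocalChat
ollama pull deepseek-r1:7b

# Start Ollama so it accepts connections from VS Code
# (needed on macOS and on remote servers)
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Verify the server is reachable and lists your models
# (replace localhost with your server's address if remote)
curl http://localhost:11434/api/tags
```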
🚀 Getting Started
Choose your preferred AI model through the Command Palette (Ctrl+Shift+P → "Select AI Model for offline chat") or via the status bar at the bottom
Start chatting!
🛠️ Extension Settings
This extension contributes the following settings:
localchat.selectedModel: Choose which AI model to use (default: "deepseek-r1:7b")
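For example, to change the model you can set it directly in your settings.json; the value must match a model tag available in your Ollama installation (check with `ollama list`):

```json
{
    // Must match a model tag pulled in your Ollama installation
    "localchat.selectedModel": "deepseek-r1:7b"
}
```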
🤝 Privacy Commitment
LocalChat is built with privacy at its core:
No telemetry
No cloud dependencies
No data collection
Everything stays on your system or designated server
🆘 Known Issues
Please report any issues on our GitHub repository.
📝 Release Notes
[0.0.2]
Added highlighting for the LLM reasoning steps
Improved documentation for setup instructions
Added esbuild for bundling the extension code
Fixed a major issue where the extension wouldn't run without a manual npm install