A 100% locally running AI code assistant for VS Code. Choose from the open-source models available through Ollama.
LocalChat - Your Private AI Coding Assistant

LocalChat is a VS Code extension that brings the power of AI code assistance to your development environment while keeping everything 100% local and private. Think of it as your personal, offline coding companion.

✨ Key Features

  • 🔒 100% Local Operation: All AI processing happens on your machine
  • 🤖 AI Code Assistance: Get code suggestions, explanations, and help
  • 🔐 Privacy Focused: No data leaves your system
  • ✅ GDPR Compliant: Your code stays with you
  • 🌐 Flexible Setup: Use Ollama locally or on your private server

📋 Requirements

  • VS Code 1.97.0 or higher
  • Ollama installed locally or running on a remote server
  • Sufficient system resources to run your chosen AI model
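
Before installing the extension, you can confirm the last two requirements are met by probing the Ollama API. This is a sketch assuming Ollama's default port (11434); `/api/tags` lists the models you have pulled locally:

```shell
# Check whether the Ollama server is reachable on its default port.
# /api/tags returns the locally available models as JSON.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable on localhost:11434"
fi
```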

⚙️ Setup

  1. Install the extension
  2. Install Ollama on your system or server (see Ollama installation instructions)
  3. On macOS, or when running Ollama on a remote server, you may need to set the environment variable OLLAMA_HOST=0.0.0.0:11434 so that VS Code can connect to Ollama
  4. Choose your preferred AI model through the command palette (Ctrl+Shift+P -> "Select AI Model for offline chat") or the bottom status bar
  5. Start chatting!
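
For a typical Ollama install, steps 2–4 above boil down to a few shell commands. This is a sketch; the install script URL is Ollama's standard one, and the model name is the extension's stated default:

```shell
# Install Ollama (Linux/macOS one-liner from ollama.com; on Windows use the installer):
# curl -fsSL https://ollama.com/install.sh | sh

# Allow connections from VS Code on macOS, or when Ollama runs on another machine:
export OLLAMA_HOST=0.0.0.0:11434

# Pull the extension's default model so it appears in the model picker:
# ollama pull deepseek-r1:7b
```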

🛠️ Extension Settings

This extension contributes the following settings:

  • localchat.selectedModel: Choose which AI model to use (default: "deepseek-r1:7b")
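
The setting can also be edited directly in your settings.json; the model name shown here is the extension's stated default and must already be pulled in Ollama:

```json
{
  // Must match a model available in your Ollama instance,
  // e.g. pulled via `ollama pull deepseek-r1:7b`
  "localchat.selectedModel": "deepseek-r1:7b"
}
```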

🤝 Privacy Commitment

LocalChat is built with privacy at its core:

  • No telemetry
  • No cloud dependencies
  • No data collection
  • Everything stays on your system or designated server

🆘 Known Issues

Please report any issues on our GitHub repository.

📝 Release Notes

[0.0.2]

  • Added highlighting for the LLM reasoning steps
  • Improved documentation for setup instructions
  • Added esbuild for bundling the extension code
  • Fixed a major issue where the extension wouldn't run without a manual npm install

LocalChat - Code with confidence, privately.
