Local AI Assistant

A Visual Studio extension that enables direct use of local LLMs (such as Ollama and LM Studio). It allows code generation, refactoring, and explanations to be completed entirely in a local environment while preserving privacy.


LocalAIContinueVS

LocalAIContinueVS is a Visual Studio extension that lets you use local LLMs (such as Ollama and LM Studio) directly within the IDE. Through a chat interface, you can generate code, refactor, and get explanations, all entirely within your local environment.

Overview

This project is designed to enable developers to leverage the power of AI while maintaining privacy. Since it communicates with an LLM server running locally without using external APIs, you can handle sensitive code with peace of mind.

Key Features

  • Local LLM Integration: Supports Ollama (/api/chat) and LM Studio (OpenAI-compatible API).
  • Inline Code Insertion: Insert AI-suggested code at the current cursor position with a single click.
  • Diff View: Review the current code against the AI's suggestions using Visual Studio's standard comparison window, allowing you to verify changes before applying them.
  • File Context Reference: Automatically load project files and provide them as context to the AI by typing @filename.
  • Editor Selection Context: Sending a chat message with code selected in the editor automatically appends that selection to the prompt as context.
  • Chat History Preservation: Conversation history is automatically saved and restored upon the next launch.
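The two supported backends return chat replies in different JSON shapes: Ollama's native /api/chat wraps the reply in a top-level "message" object, while OpenAI-compatible servers such as LM Studio return a "choices" array. A minimal sketch of how a client can normalize the two (in Python for brevity; this is illustrative, not the extension's actual LlmClient.cs code):

```python
import json

def extract_reply(backend: str, body: str) -> str:
    """Pull the assistant's text out of a non-streaming chat response."""
    data = json.loads(body)
    if backend == "ollama":
        # Ollama /api/chat (with "stream": false) wraps the reply in "message"
        return data["message"]["content"]
    # OpenAI-compatible servers (e.g. LM Studio) use a "choices" array
    return data["choices"][0]["message"]["content"]

ollama_body = '{"message": {"role": "assistant", "content": "Hello"}}'
openai_body = '{"choices": [{"message": {"role": "assistant", "content": "Hello"}}]}'
assert extract_reply("ollama", ollama_body) == "Hello"
assert extract_reply("lmstudio", openai_body) == "Hello"
```

Normalizing at this boundary lets the rest of the client treat both backends identically.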

Tech Stack

  • Language: C# 10.0+
  • Framework: .NET Framework 4.7.2 (Due to Visual Studio extension constraints)
  • UI: WPF + WebView2 (Microsoft Edge WebView2 Runtime)
  • Frontend: HTML5, CSS3, JavaScript (Vanilla JS)
  • Libraries:
    • Newtonsoft.Json (JSON serialization/deserialization)
    • Microsoft.VisualStudio.SDK (For VS extension development)
  • Supported Backends:
    • Ollama
    • LM Studio (or other OpenAI-compatible servers)
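As background on the Ollama backend: its /api/chat endpoint streams its reply by default, emitting one JSON object per line (NDJSON). Whether this extension requests streaming is not stated above, but a client that does must concatenate the per-chunk message.content fields. An illustrative sketch with hard-coded sample chunks:

```python
import json

# Sample NDJSON lines in the shape Ollama streams; the final object
# carries "done": true and no content chunk.
stream = [
    '{"message": {"content": "Hel"}, "done": false}',
    '{"message": {"content": "lo"}, "done": false}',
    '{"done": true}',
]

# Concatenate every chunk's message.content; absent fields contribute "".
reply = "".join(
    json.loads(line).get("message", {}).get("content", "")
    for line in stream
)
print(reply)  # prints "Hello"
```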

Setup and Usage

  1. Prerequisites:
    • Visual Studio 2022 or later.
    • Ollama or LM Studio must be installed and running.
  2. Configuration:
    • In Visual Studio, go to Tools > Options > Continue > General to configure the Server Base URL and the Model Name.
  3. Starting a Chat:
    • Open the chat window via View > Other Windows > Local AI Assistant.
    • Click the "Connect" button to establish a connection with the server.
  4. Using File References:
    • Type @ in the input field to display a list of files within the solution. Selecting a file will provide its content to the AI.
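The @-reference step above can be sketched as follows. The function name, prompt format, and fenced-block wrapping are illustrative assumptions, not the extension's actual implementation; the idea is simply that each "@<file>" token is replaced by that file's contents before the prompt reaches the LLM:

```python
import pathlib
import tempfile

def expand_file_refs(prompt: str, root: pathlib.Path) -> str:
    """Replace each @<file> token with the file's contents, fenced as code."""
    out = []
    for token in prompt.split():
        path = root / token[1:]
        if token.startswith("@") and path.is_file():
            out.append(f"```\n{path.read_text()}\n```")
        else:
            out.append(token)
    return " ".join(out)

# Demo with a throwaway project directory and file.
root = pathlib.Path(tempfile.mkdtemp())
(root / "util.cs").write_text("int Add(int a, int b) => a + b;")
expanded = expand_file_refs("Refactor @util.cs please", root)
print(expanded)
```

After expansion, the prompt sent to the server contains the full source of util.cs instead of the bare token.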

Project Structure

  • ChatWindowControl.xaml.cs: Main UI logic and WebView2 management.
  • LlmClient.cs: Abstracts communication with the local LLM server.
  • EditorHelper.cs: Encapsulates Visual Studio editor operations (insertion, formatting, diff view).
  • Resources/: HTML, CSS, and JavaScript files that constitute the Chat UI.

GitHub

https://github.com/Tatsunobu-Eto/LocalAIContinueVS

License

MIT License
