# LocalAIContinueVS
LocalAIContinueVS is a Visual Studio extension that lets you use local LLMs such as Ollama and LM Studio directly within the IDE. Through a chat interface, you can generate code, refactor, and get explanations, all entirely within your local environment.
## Overview
This project is designed to enable developers to leverage the power of AI while maintaining privacy. Since it communicates with an LLM server running locally without using external APIs, you can handle sensitive code with peace of mind.
## Key Features
- **Local LLM Integration**: Supports Ollama (`/api/chat`) and LM Studio (OpenAI-compatible API).
- **Inline Code Insertion**: Insert AI-suggested code at the current cursor position with a single click.
- **Diff View**: Compare the current code against the AI's suggestion in Visual Studio's standard comparison window, so you can verify changes before applying them.
- **File Context Reference**: Type `@filename` to automatically load a project file and provide it to the AI as context.
- **Editor Selection Context**: If code is selected in the editor when you send a chat message, the selection is automatically appended to the prompt as context.
- **Chat History Preservation**: Conversation history is saved automatically and restored on the next launch.
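Both backends speak JSON over HTTP on localhost. As a rough reference for what the extension sends to Ollama's `/api/chat` endpoint, a minimal request body looks like the sketch below (the model name here is only an illustrative example; `11434` is Ollama's default port). LM Studio instead exposes an OpenAI-compatible `/v1/chat/completions` endpoint, by default on port `1234`.

```json
{
  "model": "llama3",
  "messages": [
    { "role": "user", "content": "Refactor this method to use async/await." }
  ],
  "stream": false
}
```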
## Tech Stack
- **Language**: C# 10.0+
- **Framework**: .NET Framework 4.7.2 (required by Visual Studio extension constraints)