# InvAit - Local AI Chat and Agent for Visual Studio

A secure Visual Studio 2022/2026 extension with local and private AI agent support.
## 🔐 Security & Privacy
- Local First: No code leaves your machine by default.
- Private AI: Native support for local LLMs (Ollama, LM Studio, vLLM).
- Control: You define the endpoint and API keys. No hidden telemetry.
## ✨ Key Features
- Autonomous Agent: Reads files, executes terminal commands, performs Git operations, and applies code changes.
- Integrated Chat: Blazor WebAssembly UI running directly inside Visual Studio.
- Task Management: Built-in system to track and plan development tasks.
- Build Integration: Can trigger builds and analyze compilation errors.
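The agent features above boil down to a loop of model turns and tool executions: the model either answers or requests a tool, and the tool's result is fed back on the next turn. A minimal sketch, assuming a simple dict-based reply shape (the tool names and dispatch table are illustrative, not the extension's actual API):

```python
# Minimal agent loop sketch. Tool names and the reply format are
# illustrative assumptions, not InvAit's real interface.

def run_tool(name: str, arg: str) -> str:
    # Dispatch table standing in for real file/terminal integrations.
    tools = {
        "read_file": lambda p: f"<contents of {p}>",
        "run_command": lambda c: f"<output of {c}>",
    }
    return tools[name](arg)

def agent_step(model_reply: dict) -> str:
    # A reply is either a final answer or a tool request.
    if "tool" in model_reply:
        # Execute the tool; its output would be fed back to the model.
        return run_tool(model_reply["tool"], model_reply["arg"])
    return model_reply["answer"]

print(agent_step({"tool": "read_file", "arg": "Program.cs"}))
```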
## Configuration

### Option 1: Local (Recommended)

Use LM Studio or Ollama.
- Endpoint: `http://localhost:11434/api` (Ollama default)
- Model: `llama3`, `mistral`, `codellama`, `zai-org/glm-4.7-flash`
- Key: (leave empty)
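Requests to a local endpoint follow the familiar chat-completions shape: a model name plus a list of messages, sent as JSON with no API key. A sketch of such a payload, assuming the endpoint and model values from the list above (the message contents are examples):

```python
import json

# Settings mirroring the configuration fields above.
endpoint = "http://localhost:11434/api"  # Ollama default; no key needed
model = "llama3"

# Chat-completions-style payload: a model name plus a message list.
payload = {
    "model": model,
    "messages": [
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Explain this diff."},
    ],
    "stream": False,
}

# For a local endpoint the body is plain JSON with no Authorization header.
body = json.dumps(payload)
```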
### Option 2: Remote / Self-Hosted
- Endpoint: URL of your OpenAI-compatible provider.
- Key: Your API Key (stored securely).
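For a remote provider the key should come from secure storage or the environment rather than being hardcoded. A minimal sketch of building request headers, assuming a hypothetical `INVAIT_API_KEY` environment variable (the extension's own secure storage is not shown here):

```python
import os

# Hypothetical environment variable; the name is an example,
# not one the extension defines.
api_key = os.environ.get("INVAIT_API_KEY", "")

headers = {"Content-Type": "application/json"}
if api_key:
    # OpenAI-compatible providers expect a Bearer token; a local
    # endpoint with no key simply omits the Authorization header.
    headers["Authorization"] = f"Bearer {api_key}"
```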
## 🛠 Capabilities
| Category | Tools |
|----------|-------|
| Files | Read, Create, Search, Apply Diff |
| Project | Build, Get Errors, Inspect Structure |
| Git | Status, Log, Diff, Branch Info |
| System | Execute Shell Commands, Fetch URLs |
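Tools like those in the table are typically exposed to an OpenAI-compatible model as function schemas: a name, a description, and a JSON-Schema parameter object. A sketch for one Files tool, assuming the common function-calling format (the tool name and fields are illustrative, not InvAit's actual definitions):

```python
# Illustrative tool definition in the OpenAI-compatible
# function-calling format; not InvAit's actual schema.
read_file_tool = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read a file from the open solution.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {
                    "type": "string",
                    "description": "Path relative to the solution root.",
                },
            },
            "required": ["path"],
        },
    },
}
```

The model replies with the tool name and arguments, the host executes the real operation, and the result is returned as a tool message.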