LM Local

Aleksandrs Kornevs

Lightweight local AI chat for Visual Studio — interactive streaming responses, in-session chat, Markdown rendering, code highlighting, and clipboard support via LM Studio.
LMLocal: Local AI Chat for Visual Studio 2022

LMLocal is a Visual Studio extension that adds a dedicated chat interface for interacting with local LLMs served by LM Studio. It acts as a manual assistant: you type or paste prompts and receive responses and generated code directly within the IDE.

Key Features:

LM Studio Connection: Connects to the local server at http://127.0.0.1:1234 by default.

Chat Interface: A standalone tool window for entering prompts and receiving model responses.

Real-time Streaming: Displays text incrementally as tokens are generated by the model.

No Automated Code Access: The extension does not read project files or analyze open source code automatically. It only processes information manually entered or pasted by the user.

Formatting: Full Markdown support and syntax highlighting for code blocks.

Local Processing: All data remains on your hardware; no information is sent to external cloud services.

Requirements: LM Studio must be running with the Local Server enabled on port 1234.
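LM Studio's local server exposes an OpenAI-compatible chat-completions API, so the streaming behavior described above can be sketched roughly as follows. This is a minimal Python sketch for illustration, not the extension's actual implementation; the endpoint path, model name, and request shape follow LM Studio's OpenAI-compatible conventions, and the sample chunks are invented:

```python
import json

# Default LM Studio endpoint the extension connects to (OpenAI-compatible).
BASE_URL = "http://127.0.0.1:1234/v1/chat/completions"

def build_request(prompt: str) -> dict:
    """Build a streaming chat-completion request body."""
    return {
        "model": "local-model",  # LM Studio serves whichever model is loaded
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,          # ask for incremental token delivery
    }

def parse_sse_stream(lines):
    """Reassemble streamed text from Server-Sent Events 'data:' lines,
    which is how tokens arrive when "stream": true is set."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":  # sentinel that ends the stream
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"].get("content", "")
        parts.append(delta)
    return "".join(parts)

# Invented sample: two token chunks followed by the end-of-stream sentinel.
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: [DONE]',
]
print(parse_sse_stream(sample))  # Hello
```

In the real extension each delta would be appended to the chat window as it arrives, which is what produces the incremental "typing" effect.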
