
LM Local

Aleksandrs Kornevs | 99 installs | Free
Lightweight local AI chat for Visual Studio — interactive streaming responses, in-session chat, Markdown rendering, code highlighting, and clipboard support via LM Studio.

LM Local is a Visual Studio extension that adds a dedicated chat interface for interacting with local LLMs via LM Studio. It works as a manually driven assistant for prompts and code generation inside the IDE.

Key Features:

LM Studio Connection: Connects to the local server at http://127.0.0.1:1234 by default.
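LM Studio exposes an OpenAI-compatible REST endpoint on that port. As a minimal sketch of what one chat request looks like (the model name is a placeholder; LM Studio serves whichever model is currently loaded, and this is not the extension's actual code):

```python
import json
from urllib import request

# Default LM Studio local server, OpenAI-compatible chat endpoint.
BASE_URL = "http://127.0.0.1:1234/v1/chat/completions"

def build_chat_request(prompt, model="local-model", stream=True):
    """Assemble an HTTP request for one prompt.

    `model` is a placeholder name; LM Studio routes the request to
    the model loaded in its UI.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # ask the server for incremental tokens
    }
    return request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Explain this stack trace")
# Actually sending it requires LM Studio's local server to be running:
# with request.urlopen(req) as resp: ...
```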

Chat Interface: A standalone tool window for entering prompts and receiving model responses.

Real-time Streaming: Displays text incrementally as tokens are generated by the model.

Thought/Reasoning Support: Displays internal reasoning steps in dedicated expandable blocks.

Live Stats: Monitor performance in the status bar, including real-time speed (tokens/sec) and total token count.
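The speed figure is just generated tokens divided by elapsed time; a one-function sketch (hypothetical helper, not taken from the extension):

```python
def tokens_per_second(token_count, elapsed_seconds):
    """Rolling speed figure like the one shown in the status bar."""
    if elapsed_seconds <= 0:
        return 0.0  # avoid division by zero before the first token lands
    return token_count / elapsed_seconds

print(round(tokens_per_second(256, 8.0), 1))  # -> 32.0
```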

Active Window Context: Use the + button to instantly include the content of your current editor or output pane in the request.

Formatting: Full Markdown support with syntax highlighting and a Copy icon for code blocks.

No Automated Code Access: The extension does not read project files automatically. It only processes information manually entered, pasted, or explicitly shared by the user via the context button.

Local Processing: All data remains on your hardware; no information is sent to external cloud services.