LMLocal is a Visual Studio extension that adds a dedicated chat interface for interacting with local LLMs through LM Studio. It works as a manually driven assistant for prompting and code generation inside the IDE.
Key Features:
LM Studio Connection: Connects to the local server at http://127.0.0.1:1234 by default.
Chat Interface: A standalone tool window for entering prompts and receiving model responses.
Real-time Streaming: Displays text incrementally as tokens are generated by the model.
Thought/Reasoning Support: Displays internal reasoning steps in dedicated expandable blocks.
Live Stats: Shows real-time generation speed (tokens/sec) and the total token count in the status bar.
Active Window Context: A + button instantly includes the content of the current editor or output pane in the request.
Formatting: Full Markdown support with syntax highlighting and a Copy icon for code blocks.
No Automated Code Access: The extension does not read project files automatically. It only processes information manually entered, pasted, or explicitly shared by the user via the context button.
Local Processing: All data remains on your hardware; no information is sent to external cloud services.
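For orientation, the connection described above can be sketched against LM Studio's OpenAI-compatible chat completions endpoint. This is a minimal sketch, not the extension's actual code: the function name `build_chat_request`, the `"local-model"` placeholder, and the system-message framing of shared editor context are illustrative assumptions; only the default address http://127.0.0.1:1234 comes from the description above.

```python
import json

# Default LM Studio server address, per the feature list above.
# The /v1/chat/completions path assumes LM Studio's OpenAI-compatible API.
LM_STUDIO_URL = "http://127.0.0.1:1234/v1/chat/completions"

def build_chat_request(prompt, context=None, stream=True):
    """Build an OpenAI-style chat request body (hypothetical helper).

    `context` stands in for editor/output-pane text the user explicitly
    shares via the + button; nothing is read automatically.
    """
    messages = []
    if context:
        messages.append({"role": "system",
                         "content": "Active window context:\n" + context})
    messages.append({"role": "user", "content": prompt})
    return json.dumps({"model": "local-model",  # LM Studio serves whichever model is loaded
                       "messages": messages,
                       "stream": stream})

body = build_chat_request("Explain this function",
                          context="def add(a, b): return a + b")
```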
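The real-time streaming behavior works by consuming server-sent events and appending each token fragment as it arrives. The sketch below shows the general shape of that loop, assuming OpenAI-style `data:` lines ending with a `[DONE]` sentinel; the function name is illustrative, not taken from the extension.

```python
import json

def accumulate_stream(sse_lines):
    """Accumulate incremental token text from OpenAI-style SSE lines.

    Each `data:` line carries a JSON chunk whose `delta` holds the next
    text fragment; the stream terminates with `data: [DONE]`.
    """
    text = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip keep-alives / blank lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        # The first chunk may carry only a role, so default to "".
        delta = chunk["choices"][0]["delta"].get("content", "")
        text.append(delta)
    return "".join(text)
```

In the extension this accumulation would feed the chat window incrementally rather than returning one final string.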
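The expandable reasoning blocks imply splitting the model's output into thought spans and answer text. A common convention among local reasoning models is to wrap internal reasoning in `<think>...</think>` tags; the sketch below assumes that convention (the tag name and the function are assumptions, not documented behavior of LMLocal).

```python
import re

# Assumed delimiter convention for reasoning output.
THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def split_reasoning(text):
    """Separate reasoning spans from the visible answer text.

    Returns (list of thought blocks, remaining answer). The thought
    blocks would be rendered as collapsed, expandable sections.
    """
    thoughts = THINK_RE.findall(text)
    answer = THINK_RE.sub("", text).strip()
    return thoughts, answer
```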