# LMAgentChat VS Code Extension
LMAgentChat is a VS Code extension for local and remote coding assistance. It provides a custom chat panel, optional native chat participant integration, provider-backed inline completions, workspace-aware context injection, and a small planner/executor agent loop.
## What It Does
- Multi-provider chat with LM Studio, Ollama, and OpenAI-compatible endpoints.
- Two chat surfaces:
  - A custom webview chat panel and sidebar view.
  - A native VS Code chat participant when the host build supports the chat API, with fallback to the custom panel.
- Chat modes in the custom panel: Ask, Plan, Build, and Agent.
- Inline code completions backed by the selected provider instead of placeholder suggestions.
- Workspace-aware prompt enrichment through mention tokens, attachments, diagnostics, and codebase retrieval.
- A separate `Run Agent Task` command that runs a planner/executor loop over structured actions.
- LM Studio MCP integration passthrough via `lmagentchat.lmstudio.integrations`.
## Core Features

### Chat UI
- Open the custom chat panel with `LMAgentChat: Open Chat`.
- Focus the sidebar chat view with `LMAgentChat: Focus Sidebar Chat`.
- Open the settings webview with `LMAgentChat: Open Settings Page`.
- Use the chat composer to:
  - upload file attachments,
  - attach currently open editors,
  - insert mention tokens from the context picker,
  - stream responses with stop support,
  - browse and rename chat sessions,
  - export or clear the current session.
### Native Chat

- `LMAgentChat: Open Native Chat` opens VS Code native chat when available.
- If native chat APIs are unavailable in the host build, the command falls back to the custom chat panel.
- The native chat participant supports `/explain`, `/fix`, `/test`, and `/doc` request styles.
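The slash-command dispatch can be sketched as a mapping from command name to a prompt framing. The template strings and function names below are illustrative, not the extension's actual prompts:

```typescript
// Sketch: map the participant's slash commands to prompt framings.
// Templates are illustrative, not the extension's real prompts.
const SLASH_PROMPTS: Record<string, (input: string) => string> = {
  explain: (s) => `Explain the following code:\n${s}`,
  fix: (s) => `Find and fix the bugs in:\n${s}`,
  test: (s) => `Write tests for:\n${s}`,
  doc: (s) => `Write documentation for:\n${s}`,
};

function framePrompt(command: string | undefined, input: string): string {
  const frame = command ? SLASH_PROMPTS[command] : undefined;
  // Requests without a recognized slash command pass through unchanged.
  return frame ? frame(input) : input;
}
```

A request such as `/explain` plus the user's text would be framed before being sent to the selected provider.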
### Inline Completions

- Inline completions are enabled by `lmagentchat.enableAutocomplete`.
- Requests are throttled, cached, and built from:
  - current file context,
  - visible editors and tabs,
  - prefix and suffix code windows.
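The throttle-and-cache behavior can be sketched as follows; the class, method, and parameter names are illustrative, not the extension's actual API:

```typescript
// Sketch: a throttled, cached completion requester (illustrative names).
type CompletionFetcher = (prompt: string) => Promise<string>;

class CompletionRequester {
  private cache = new Map<string, string>();
  private lastRequest = 0;

  constructor(
    private fetcher: CompletionFetcher,
    private minIntervalMs = 300,
  ) {}

  async request(prefix: string, suffix: string): Promise<string | undefined> {
    const key = `${prefix}\u0000${suffix}`;
    // Serve identical prefix/suffix windows from the cache.
    const cached = this.cache.get(key);
    if (cached !== undefined) return cached;

    // Throttle: drop requests that arrive too soon after the last one.
    const now = Date.now();
    if (now - this.lastRequest < this.minIntervalMs) return undefined;
    this.lastRequest = now;

    const completion = await this.fetcher(`${prefix}<CURSOR>${suffix}`);
    this.cache.set(key, completion);
    return completion;
  }
}
```

Caching on the exact prefix/suffix pair means repeated cursor positions never re-hit the provider, while the throttle bounds request rate during rapid typing.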
### Context Providers
The custom chat input supports mention tokens. Tokens are resolved into prompt context and stripped from the user-facing request text before inference.
- `@currentFile`: active editor content.
- `@open`: open files from visible editors and tabs.
- `@open:N`: open files with an explicit cap.
- `@tree`: workspace tree snapshot.
- `@tree:query`: narrowed tree context.
- `@problems`: diagnostics from the active file.
- `@problems:workspace`: diagnostics across the workspace.
- `@terminal`: active terminal metadata.
- `@prompt` and `@rules`: prompt/rules-style files in the workspace.
- `@codebase:query`: ranked snippet retrieval from a lightweight local codebase index.
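Token extraction and stripping can be sketched with a single pattern over the token names above; this is an illustrative parser, not the extension's actual implementation:

```typescript
// Sketch: extract mention tokens and strip them from the user-facing
// request text (illustrative, not the extension's real parser).
interface ParsedInput {
  text: string;     // user-facing request with tokens removed
  tokens: string[]; // raw mention tokens, e.g. "@tree:src"
}

const MENTION =
  /@(?:currentFile|open|tree|problems|terminal|prompt|rules|codebase)(?::\S+)?/g;

function parseMentions(input: string): ParsedInput {
  const tokens = input.match(MENTION) ?? [];
  // Remove tokens, then collapse the whitespace they leave behind.
  const text = input.replace(MENTION, "").replace(/\s{2,}/g, " ").trim();
  return { text, tokens };
}
```

Each extracted token would then be resolved into prompt context (file contents, diagnostics, tree snapshots) before inference, as described above.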
### Sessions And Storage

- Chat sessions are persisted under the extension's global storage path.
- The current build uses JSON-backed storage for active persistence. `lmagentchat.chat.storageBackend` exposes `json` and `sqlite`, but the SQLite option currently falls back to JSON at runtime.
- Older chat messages can be archived and summarized based on these settings:
  - `lmagentchat.chat.memoryBufferSize`
  - `lmagentchat.chat.compactionThreshold`
  - `lmagentchat.chat.enableAutoSummarize`
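The archive-and-summarize policy those settings imply can be sketched as below. The function shape and the exact cut point are assumptions for illustration; the real implementation may differ:

```typescript
// Sketch: compact a session once it exceeds the compaction threshold,
// keeping the most recent memory buffer and summarizing the rest.
// (Illustrative; the extension's actual policy may differ.)
interface ChatMessage { role: "user" | "assistant"; content: string }

function compact(
  messages: ChatMessage[],
  memoryBufferSize: number,     // lmagentchat.chat.memoryBufferSize
  compactionThreshold: number,  // lmagentchat.chat.compactionThreshold
  summarize: (older: ChatMessage[]) => string,
): ChatMessage[] {
  // Only compact once the session exceeds the threshold.
  if (messages.length <= compactionThreshold) return messages;
  const cut = messages.length - memoryBufferSize;
  const summary = summarize(messages.slice(0, cut));
  // Replace the archived prefix with one summary message.
  return [{ role: "assistant", content: summary }, ...messages.slice(cut)];
}
```

With the default settings (buffer 20, threshold 40), a 45-message session would shrink to one summary plus the 20 newest messages.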
### Agent Capabilities

There are two related agent surfaces:

- Agent mode in the custom chat panel.
- `LMAgentChat: Run Agent Task`, which runs the standalone planner/executor loop.
The standalone loop can plan and execute structured actions such as:

- `read_file`
- `write_file`
- `search`
- `run_command`
- `refactor_code`
- `summarize`
- `ask_user`
- `done`

`run_command` is gated by `lmagentchat.agent.allowRunCommands`.
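An executor step over these actions can be sketched as a tagged-union dispatch. The action names come from the list above; the handler bodies and the exact gating message are illustrative placeholders:

```typescript
// Sketch: one executor step for a subset of the structured actions.
// Action names are from the README; handler bodies are placeholders.
type Action =
  | { kind: "read_file"; path: string }
  | { kind: "run_command"; command: string }
  | { kind: "done" };

interface AgentConfig {
  allowRunCommands: boolean; // lmagentchat.agent.allowRunCommands
}

function executeAction(action: Action, config: AgentConfig): string {
  switch (action.kind) {
    case "read_file":
      return `read ${action.path}`; // placeholder for real file I/O
    case "run_command":
      // run_command is gated by the allowRunCommands setting.
      if (!config.allowRunCommands) return "run_command blocked by settings";
      return `ran ${action.command}`; // placeholder for real process spawn
    case "done":
      return "done";
  }
}
```

The planner would emit actions until `done` (or until `lmagentchat.agent.maxSteps` is reached), with `run_command` refused unless explicitly enabled.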
## Command Surface

The extension contributes the following command groups.

### Chat And Settings

- Activate Extension
- Open Chat
- Open Native Chat
- Focus Chat
- Focus Sidebar Chat
- Open Settings Page
- Run Agent Task
### Refactoring And Analysis

- Refactor Code
- Explain Code
- Simplify Code
- Add Type Annotations
- Optimize Imports
- Analyze Code
- Find Bugs
- Optimize Code
- Security Audit
- Performance Review
### Testing

- Generate Tests
- Generate Unit Tests
- Generate Integration Tests
- Generate E2E Tests
- Generate Mocks
- Generate Mock Data
- Generate Fixtures
- Run Tests
- Debug Test
- Coverage Report
- Fix Failing Tests
### Documentation, Context, And Git

- Generate Documentation
- Generate README
- Update Documentation
- Generate Changelog
- Export to Markdown
- Show Context
- Update Context
- Clear Context
- Export Context
- Show Git Context
- Analyze Recent Commits
- Branch Overview
- Analyze Git Diff
## Configuration

Common settings:

```json
{
  "lmagentchat.provider": "lmstudio",
  "lmagentchat.lmstudio.url": "http://localhost:1234",
  "lmagentchat.lmstudio.apiKey": "",
  "lmagentchat.lmstudio.integrations": [],
  "lmagentchat.lmstudio.autoSelectModel": true,
  "lmagentchat.lmstudio.currentModel": "",
  "lmagentchat.ollama.url": "http://localhost:11434",
  "lmagentchat.ollama.currentModel": "",
  "lmagentchat.openai.url": "https://api.openai.com",
  "lmagentchat.openai.apiKey": "",
  "lmagentchat.openai.currentModel": "",
  "lmagentchat.enableAutocomplete": true,
  "lmagentchat.agent.maxSteps": 8,
  "lmagentchat.agent.allowRunCommands": false,
  "lmagentchat.agentMode.requireCompatibleModel": false,
  "lmagentchat.chat.memoryBufferSize": 20,
  "lmagentchat.chat.compactionThreshold": 40,
  "lmagentchat.chat.enableAutoSummarize": true,
  "lmagentchat.chat.maxRenderedMessages": 120,
  "lmagentchat.chat.storageBackend": "json"
}
```
### LM Studio MCP Integrations

`lmagentchat.lmstudio.integrations` is passed through to LM Studio chat requests as the `integrations` array.

Example values:

```json
["mcp/playwright"]
```

```json
[
  {
    "type": "ephemeral_mcp",
    "server_label": "huggingface",
    "server_url": "https://huggingface.co/mcp"
  }
]
```
The extension validates that this setting is a JSON array before saving it from the settings panel.
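That validation step can be sketched as a small helper; the function name is illustrative, not the extension's actual API:

```typescript
// Sketch: accept an integrations value only if it parses as a JSON array
// (illustrative helper; the extension's actual check may differ).
function isValidIntegrations(raw: string): boolean {
  try {
    return Array.isArray(JSON.parse(raw));
  } catch {
    // Malformed JSON is rejected rather than saved.
    return false;
  }
}
```

Both example values above pass this check, while a bare object or malformed JSON would be rejected before the setting is saved.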
## Development

Install dependencies and compile:

```bash
npm install
npm run compile
```

During development:

```bash
npm run watch
```

Run tests:

```bash
npm test
```
## Project Notes
- The settings page is a custom webview and includes built-in setup and command documentation.
- The chat panel uses a windowed renderer for long conversations to avoid mounting the full DOM history.
- The floating scroll-to-bottom button appears when the conversation is no longer near the latest message.
## Additional Docs

## License

MIT. See LICENSE.