Agentic LSP - Lumen


Maaz Siddiqi

Thin VS Code wrapper for the Lumen language server.

Lumen VS Code

Lumen is an experimental semantic language server for VS Code. It uses an AI agent to read the current file and publish diagnostics in the Problems panel.

This extension packages the Lumen language server and launches it with VS Code's built-in Node runtime.

Requirements

  • VS Code 1.95 or newer.
  • opencode installed, configured, and available on your PATH.
  • A workspace folder open in VS Code.

Lumen depends on opencode for model access and for reading your codebase. If opencode cannot start, Lumen diagnostics will not appear.

Install opencode

Install opencode before using Lumen. The official installation docs are at https://opencode.ai/en/docs.

Recommended install script:

curl -fsSL https://opencode.ai/install | bash

Alternative install options:

npm install -g opencode-ai
brew install anomalyco/tap/opencode

After installing, verify that VS Code can find it:

opencode --version

Then configure opencode with the model provider and API key you want to use. If you launched VS Code from a desktop launcher and Lumen cannot find opencode, start VS Code from a terminal where opencode --version succeeds, so VS Code inherits that terminal's PATH.
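As a quick sanity check, a short snippet like the following (a sketch assuming a POSIX shell) reports whether the shell you launch VS Code from can see opencode:

```shell
# Report whether opencode is visible on this shell's PATH.
# A VS Code started from this same shell inherits the PATH shown here.
if command -v opencode >/dev/null 2>&1; then
  echo "opencode found at: $(command -v opencode)"
  opencode --version
else
  echo "opencode not on PATH: ${PATH}"
fi
```

If the "not on PATH" branch prints, fix your shell profile (or installer step) before troubleshooting Lumen itself.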

Getting Started

  1. Install the Lumen VSIX.
  2. Open a project folder in VS Code.
  3. Open a source file.
  4. Wait a few seconds for Lumen to analyze the file.
  5. Check the Problems panel for diagnostics with source lumen.

Lumen runs automatically after VS Code starts. It analyzes opened files and reanalyzes them after edits with a short debounce.
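Step 1 can be done from the command line with the VS Code CLI. A sketch, assuming the code command is on your PATH; the .vsix filename is hypothetical, so substitute the file you actually downloaded:

```shell
# Install a downloaded VSIX with the VS Code CLI, then confirm the
# extension is registered. "./lumen.vsix" is a placeholder filename.
if command -v code >/dev/null 2>&1; then
  code --install-extension ./lumen.vsix
  code --list-extensions | grep -i lumen
else
  echo "code CLI not found; use the Extensions view (Install from VSIX...) instead"
fi
```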

What To Expect

  • Diagnostics may take several seconds because they come from an AI agent.
  • Lumen may report semantic issues that normal language servers miss.
  • Lumen may also miss issues or produce diagnostics that need human judgment.
  • Results are cached per file content for a short alpha-friendly feedback loop.
  • Some diagnostics include quick fixes. Use VS Code's Quick Fix action to apply one.

Treat Lumen as a review assistant, not a compiler. The generated quick fixes should be reviewed before committing.

Commands

  • Lumen: Restart Language Server restarts the bundled language server.

Use this if diagnostics stop updating, opencode was reconfigured, or you changed Lumen settings.

Troubleshooting

Open the Lumen Output channel in VS Code if diagnostics do not appear.

Common issues:

  • opencode is not installed or is not on the PATH visible to VS Code.
  • opencode is installed but not configured for a model provider.
  • The workspace has no folder open, so Lumen falls back to the extension folder as its working directory.
  • Ports 4317 or 4318 are already in use.
  • The selected file is large, generated, or not useful for semantic analysis.

After fixing setup issues, run Lumen: Restart Language Server.
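To check whether ports 4317 and 4318 are already taken, a small script like this can help (a sketch that assumes bash, which provides the /dev/tcp pseudo-device):

```shell
# Prints "in use" if something accepts a TCP connection on the port,
# "free" otherwise. Relies on bash's /dev/tcp redirection.
check_port() {
  if (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null; then
    echo "port $1 is in use"
  else
    echo "port $1 is free"
  fi
}

for p in 4317 4318; do
  check_port "$p"
done
```

If a port is in use, stop the conflicting process (often another telemetry or dev-server instance) and then run Lumen: Restart Language Server.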

Advanced Settings

Use the lumen.server.* settings only if you need a custom server runtime, cwd, args, or environment.
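A workspace settings file might then look like the sketch below. The exact key names under lumen.server.* are assumptions (this page only names the lumen.server.* prefix); check the extension's contributed settings in the VS Code Settings UI for the authoritative names.

```jsonc
// .vscode/settings.json -- illustrative only; the key names and values
// below are assumed, not confirmed by this README.
{
  "lumen.server.runtime": "/usr/local/bin/node",
  "lumen.server.cwd": "${workspaceFolder}",
  "lumen.server.args": ["--stdio"],
  "lumen.server.env": { "LOG_LEVEL": "debug" }
}
```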
