nlpilot — VS Code Extension

A GitHub Copilot-style AI coding assistant for Visual Studio Code, powered by the nlpilot CLI. Brings chat, inline chat, ghost-text completions, and smart code actions directly into the editor — all backed by the same models and sessions as the CLI.


Requirements

  • Visual Studio Code 1.91.0 or later
  • Sign in with nlpilot: Sign In after installing the extension

Installation

From source

cd nlpilot-vscode-extension
bun install
cd webview && bun install && cd ..

# Build both extension host and webview
bun run build

# Launch in Extension Development Host
# Press F5 in VS Code with this folder open

Package as .vsix

bun run package
# Outputs: nlpilot-vscode-0.0.1.vsix

Install the .vsix via Extensions: Install from VSIX… in the Command Palette.


Getting Started

  1. Sign in — open the Command Palette (Ctrl+Shift+P) and run nlpilot: Sign In, or run nlpilot login in a terminal
  2. Open the chat panel — press Ctrl+Alt+C or click the nlpilot icon in the Activity Bar
  3. Start chatting — type a message and press Enter

If you haven't run nlpilot login yet, the extension shows a "Sign In" notification and surfaces an error in the chat panel.


Features

Chat Panel (Sidebar)

A full React-based chat interface in the VS Code side panel, backed by the nlpilot CLI engine.

  • Streaming token-by-token responses
  • Markdown rendering with syntax-highlighted code blocks (via streamdown + shiki)
  • Full list/heading/table rendering via @tailwindcss/typography prose styles
  • Tool-call chips showing which tools the agent invoked
  • Cancel in-flight requests with the stop button
  • Auto-scroll with a scroll-to-bottom button when you're reading history
  • Context pill showing the active file or selected line range
  • Model badge in the footer showing the active model
  • New Chat button to start a fresh session

Open with: Ctrl+Alt+C / Command Palette → nlpilot: Open nlpilot Chat


Native Chat Participant (@nlpilot)

nlpilot registers as a participant in VS Code's built-in Chat panel (alongside GitHub Copilot).

@nlpilot explain the authentication flow
@nlpilot /fix there's a type error on line 42
@nlpilot /tests cover all edge cases

Slash commands

| Command | Description |
| --- | --- |
| /explain | Explain the selected code or the current file |
| /fix | Diagnose and apply a fix, passing current diagnostics as context |
| /tests | Generate unit tests for the selection or file |
| /docs | Add JSDoc / docstring documentation to the selection |
| /refactor | Suggest and explain a refactor |
| /new | Start a new conversation (reset session) |
| /model | Open a model picker and update the active model |
| /compact | Summarise the conversation history to save context tokens |
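Under the hood, each slash command ultimately expands into a prompt sent to the CLI. A minimal sketch of how such a dispatch could look — the names (`SLASH_PROMPTS`, `buildPrompt`) and templates are illustrative, not the extension's actual internals:

```typescript
// Hypothetical sketch: mapping slash commands to prompt templates.
// The template wording is invented for illustration.
type SlashCommand = "explain" | "fix" | "tests" | "docs" | "refactor";

const SLASH_PROMPTS: Record<SlashCommand, (input: string) => string> = {
  explain: (input) => `Explain the following code:\n${input}`,
  fix: (input) => `Diagnose and fix the issues in:\n${input}`,
  tests: (input) => `Generate unit tests for:\n${input}`,
  docs: (input) => `Add JSDoc documentation to:\n${input}`,
  refactor: (input) => `Suggest and explain a refactor of:\n${input}`,
};

function buildPrompt(command: string, input: string): string {
  const template = SLASH_PROMPTS[command as SlashCommand];
  // Unknown commands fall through to a plain chat message.
  return template ? template(input) : input;
}
```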

Context variables

Attach context directly in the chat input:

| Variable | Description |
| --- | --- |
| #file | Attach the contents of a file (capped at 120 KB) |
| #selection | Attach the current editor selection with line range |
| #codebase | Attach the workspace file tree + open file list |
| #problems | Attach active diagnostics (errors and warnings) |
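The 120 KB cap on #file keeps large files from blowing the context window. A sketch of how such a cap might be enforced — truncating on a line boundary so the model sees syntactically intact code (the helper name and marker text are hypothetical):

```typescript
// Hypothetical sketch of the #file size cap: keep at most 120 KB of a
// file's contents, cutting at the last complete line within the budget.
const MAX_FILE_BYTES = 120 * 1024;

function capFileContext(contents: string): string {
  if (Buffer.byteLength(contents, "utf8") <= MAX_FILE_BYTES) return contents;
  // Truncate by bytes, then drop the (possibly partial) final line.
  const truncated = Buffer.from(contents, "utf8")
    .subarray(0, MAX_FILE_BYTES)
    .toString("utf8");
  const lastNewline = truncated.lastIndexOf("\n");
  const kept = lastNewline > 0 ? truncated.slice(0, lastNewline) : truncated;
  return kept + "\n… [truncated at 120 KB]";
}
```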

Follow-up suggestions

After each response, the participant returns 2–3 contextual follow-up prompts you can click to continue the conversation.


Inline Chat (Cmd+I / Ctrl+I)

A floating input that appears directly inside the editor, similar to Copilot's inline chat.

  • Select code and press Cmd+I / Ctrl+I to open
  • Captures the selection and ±50 lines of surrounding context
  • Streams the response diff back as text-edit replacements
  • Accept / Reject / Retry controls in the inline widget
  • /fix shorthand — triggered automatically on diagnostic squiggles via the "Diagnose and fix with nlpilot" quick action
  • /edit — takes a free-form instruction and applies the agent's diff to the selection
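Capturing the selection plus ±50 lines of surrounding code is simple window arithmetic clamped to the file bounds. A sketch with illustrative names (not the extension's real code):

```typescript
// Hypothetical sketch: capture a selection plus up to 50 lines of
// context on each side, as inline chat does before prompting the model.
const CONTEXT_LINES = 50;

function captureWindow(
  lines: string[],
  startLine: number, // 0-based first selected line
  endLine: number,   // 0-based last selected line (inclusive)
): { prefix: string[]; selection: string[]; suffix: string[] } {
  const from = Math.max(0, startLine - CONTEXT_LINES);
  const to = Math.min(lines.length, endLine + 1 + CONTEXT_LINES);
  return {
    prefix: lines.slice(from, startLine),
    selection: lines.slice(startLine, endLine + 1),
    suffix: lines.slice(endLine + 1, to),
  };
}
```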

Ghost-Text Inline Completions

Tab-completion ghost text as you type, similar to Copilot's completions.

  • Works in all languages (*)
  • 300 ms debounce after each keystroke before triggering
  • Builds a prompt from: file language, path, last 100 lines (prefix), next 20 lines (suffix)
  • Always uses the fastest/cheapest model (claude-3-haiku) to keep latency low
  • LRU cache of the last 20 completions keyed by prefix hash
  • "Generating…" spinner in the status bar while a request is in flight

Toggle with: Command Palette → nlpilot: Toggle Inline Completions
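The completion cache described above (last 20 completions, keyed by a prefix hash) can be sketched with a Map-backed LRU, since a JavaScript Map preserves insertion order. The class name is illustrative, not the extension's actual implementation:

```typescript
// Hypothetical sketch of the completion LRU cache. Deleting and
// re-inserting a key marks it most-recently-used; the first key in
// iteration order is therefore the least-recently-used.
class CompletionCache {
  private entries = new Map<string, string>();
  constructor(private capacity = 20) {}

  get(prefixHash: string): string | undefined {
    const hit = this.entries.get(prefixHash);
    if (hit !== undefined) {
      // Refresh recency on hit.
      this.entries.delete(prefixHash);
      this.entries.set(prefixHash, hit);
    }
    return hit;
  }

  set(prefixHash: string, completion: string): void {
    this.entries.delete(prefixHash);
    this.entries.set(prefixHash, completion);
    if (this.entries.size > this.capacity) {
      // Evict the least-recently-used entry.
      const oldest = this.entries.keys().next().value as string;
      this.entries.delete(oldest);
    }
  }
}
```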


Smart Code Actions (Lightbulb / Right-click)

The extension registers code actions for all file types; they appear in the lightbulb menu and in the right-click context menu.

| Action | Kind | Trigger |
| --- | --- | --- |
| Fix with nlpilot | QuickFix (isPreferred: true) | Diagnostic squiggle |
| Explain Selection | (empty) | Text selected |
| Add Docs | RefactorRewrite | Text selected |
| Generate Tests | (empty) | Text selected |

The editor title toolbar also shows ⚡ buttons to open Inline Chat and to fix diagnostics.


Session Management

Sessions are stored at ~/.nlpilot/sessions/<cwdHash>/ — the same files the CLI uses, so history is shared between terminal and editor.

Sessions tree view

The Sessions panel in the nlpilot Activity Bar sidebar shows all sessions for the current workspace:

  • Click a session to load it into the chat panel
  • Rename sessions with the pencil icon
  • Delete sessions with the trash icon
  • Refresh the list with the refresh icon

New Chat

Click the + button in the chat panel toolbar or run nlpilot: New Chat Session to start a fresh session. This posts a NEW_SESSION message to the webview and calls sessionStore.newSession().

Compact

Click the compact button or use @nlpilot /compact to summarise the conversation history. The CLI is invoked with /compact and the result replaces the messages array with a concise summary.
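Conceptually, compacting replaces the accumulated message history with a single summary message. A sketch of that shape — the types and function are hypothetical, and the real summary comes from the CLI's /compact, not from this code:

```typescript
// Hypothetical sketch of what /compact does to the session: collapse the
// history into one summary message, keeping any system messages intact.
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

function compact(messages: Message[], summary: string): Message[] {
  const system = messages.filter((m) => m.role === "system");
  return [
    ...system,
    { role: "assistant", content: `Summary of earlier conversation: ${summary}` },
  ];
}
```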


Settings

All settings live under the nlpilot.* namespace in VS Code settings (Ctrl+,).

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| nlpilot.model | string | — | Override the default model |
| nlpilot.provider | openai \| anthropic \| google | — | AI provider |
| nlpilot.completions.enabled | boolean | true | Enable ghost-text inline completions |
| nlpilot.completions.debounceMs | number | 300 | Debounce delay (ms) before triggering a completion |
| nlpilot.tools.allowAll | boolean | false | Skip tool approval prompts (autopilot mode) |
| nlpilot.tools.allowList | string[] | [] | Tools that are always allowed without prompting |
| nlpilot.tools.denyList | string[] | [] | Tools that are always denied |
| nlpilot.mcpServers | object | — | MCP server config (mirrors .mcp.json) |
| nlpilot.instructions | string[] | [] | Additional system instruction file paths |

Open settings: Command Palette → nlpilot: Configure nlpilot Settings


Commands

All commands are accessible from the Command Palette (Ctrl+Shift+P) under the nlpilot category.

| Command | Default Keybinding | Description |
| --- | --- | --- |
| nlpilot: Open nlpilot Chat | Ctrl+Alt+C | Open the chat panel |
| nlpilot: Start Inline Chat | Cmd+I / Ctrl+I | Start inline chat in the editor |
| nlpilot: Explain Selection | — | Explain the current selection |
| nlpilot: Fix with nlpilot | — | Fix diagnostics in the current file |
| nlpilot: Generate Tests | — | Generate unit tests for the selection |
| nlpilot: Add Docs | — | Add documentation to the selection |
| nlpilot: New Chat Session | — | Start a fresh conversation |
| nlpilot: Sign In | — | Run the login flow |
| nlpilot: Sign Out | — | Clear stored credentials |
| nlpilot: Select Model | — | Open model picker and update active model |
| nlpilot: Toggle Inline Completions | — | Toggle ghost-text completions on/off |
| nlpilot: Compact Conversation | — | Summarise conversation history |
| nlpilot: Configure nlpilot Settings | — | Open VS Code settings filtered to nlpilot |

Status Bar

The status bar (right side) shows:

  • Active model name — click to open the model picker
  • Spinner while a completion or chat response is in flight
  • "Not signed in" warning when credentials are missing

Local Telemetry

nlpilot logs usage locally to ~/.nlpilot/stats.json — no data leaves your machine. Logged metrics include session counts, tool-call counts, and error types. Telemetry is automatically disabled when VS Code's telemetry.telemetryLevel is set to off.
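The gate on VS Code's telemetry.telemetryLevel can be sketched as a pure function wrapping each stat update; the function names here are illustrative, not the LocalTelemetry module's real API:

```typescript
// Hypothetical sketch: local-only usage stats, gated on VS Code's
// telemetry level. Only "off" disables logging, per the behaviour above.
type Stats = Record<string, number>;

function telemetryEnabled(telemetryLevel: string): boolean {
  return telemetryLevel !== "off";
}

function bump(stats: Stats, metric: string, telemetryLevel: string): Stats {
  if (!telemetryEnabled(telemetryLevel)) return stats;
  return { ...stats, [metric]: (stats[metric] ?? 0) + 1 };
}
```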


Architecture

nlpilot-vscode-extension/
├── src/
│   ├── extension.ts                # Activation entry point
│   ├── agent/
│   │   └── NlpilotParticipant.ts   # @nlpilot chat participant + slash commands
│   ├── bridge/
│   │   ├── NlpilotProcess.ts       # Spawns CLI as a child process + retry logic
│   │   ├── UIMessageStreamAdapter.ts  # CLI NDJSON → ReadableStream<UIMessageChunk>
│   │   ├── PostMessageStreamAdapter.ts # Extension → webview postMessage bridge
│   │   ├── CredentialsBridge.ts    # Reads ~/.nlpilot/credentials
│   │   └── types.ts                # SpawnOptions, NlpilotBridgeChunk types
│   ├── completion/
│   │   └── InlineCompletionProvider.ts  # Ghost-text completions
│   ├── context/
│   │   └── ContextResolver.ts      # #file, #selection, #codebase, #problems resolvers
│   ├── inline/
│   │   └── InlineChatHandler.ts    # Cmd+I inline chat
│   ├── actions/
│   │   └── NlpilotCodeActionsProvider.ts  # Fix/Explain/Tests/Docs lightbulb actions
│   ├── session/
│   │   ├── SessionStore.ts         # Session read/write (~/.nlpilot/sessions/)
│   │   └── SessionTreeProvider.ts  # Sessions sidebar tree view
│   ├── status/
│   │   └── StatusBarManager.ts     # Model name + spinner status bar item
│   ├── telemetry/
│   │   └── LocalTelemetry.ts       # Local-only stats logging
│   └── views/
│       ├── ChatWebviewProvider.ts  # Sidebar webview panel host
│       └── SidebarStubs.ts         # MCP + placeholder tree views
└── webview/                        # React + Vite UI
    └── src/
        ├── App.tsx                 # Root component, useChat wiring
        ├── main.tsx                # Webview entry point
        ├── index.css               # Tailwind v4 + VS Code theme CSS variables
        ├── lib/
        │   ├── utils.ts            # cn() helper
        │   └── vscode.ts           # acquireVsCodeApi() singleton
        └── components/
            ├── ContextPill.tsx     # Active file / selection badge above input
            ├── ModelBadge.tsx      # Active model name in footer
            ├── ModeSelector.tsx    # Ask / Plan / Autopilot toggle (deferred)
            ├── CommandMenu.tsx     # Slash-command quick pick in input
            ├── ToolCallChip.tsx    # Tool invocation chip in message stream
            └── ai-elements/        # AI chat UI components
                ├── conversation.tsx  # StickToBottom scroll container
                ├── message.tsx       # Message bubble + Streamdown renderer
                ├── prompt-input.tsx  # Input area with submit / cancel
                ├── code-block.tsx    # Syntax-highlighted code block
                ├── reasoning.tsx     # Collapsible thinking / reasoning block
                ├── tool.tsx          # Tool-call + result accordion
                ├── confirmation.tsx  # Tool approval modal
                └── ...

Communication flow

User input (webview)
  │  postMessage SEND_MESSAGE
  ▼
ChatWebviewProvider (extension host)
  │  spawnWithRetry(opts) → NlpilotProcess
  ▼
nlpilot CLI child process
  │  NDJSON lines on stdout
  ▼
UIMessageStreamAdapter → ReadableStream<UIMessageChunk>
  │  postMessage NLPILOT_CHUNK (per frame)
  ▼
Webview VsCodePostMessageTransport → useChat (AI SDK React)
  │  incremental state updates
  ▼
Rendered conversation (React)
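The trickiest part of the NDJSON step is that stdout "data" events can end mid-line, so a complete JSON object may arrive split across two events. A sketch of the line buffering (illustrative, not UIMessageStreamAdapter's real code):

```typescript
// Hypothetical sketch of NDJSON framing: keep the trailing partial line
// from each stdout chunk and prepend it to the next one.
class NdjsonBuffer {
  private partial = "";

  // Feed one stdout chunk; returns every complete JSON object it contained.
  push(chunk: string): unknown[] {
    this.partial += chunk;
    const lines = this.partial.split("\n");
    this.partial = lines.pop() ?? ""; // last element is an incomplete line
    return lines
      .filter((line) => line.trim() !== "")
      .map((line) => JSON.parse(line));
  }
}
```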

Shared State with the CLI

The extension reads and writes the same files as the CLI so both stay in sync:

| File | Description |
| --- | --- |
| ~/.nlpilot/credentials | API key, provider, model, optional custom endpoint |
| ~/.nlpilot/sessions/<hash>/ | Conversation history |
| ~/.nlpilot/mcp.json | Global MCP server config |
| .mcp.json (project root) | Project-level MCP server config |

License

MIT
