LLM API Proxy Chat Provider
Menno van Leeuwen

Integrates your local LLM API Proxy into VS Code Copilot Chat with dynamic model discovery

LLM API Proxy — VS Code Chat Provider

Use your LLM API Proxy directly in VS Code's Copilot Chat. Models are discovered automatically — no hardcoding required.

Quick Start

  1. Ensure LLM API Proxy is running
  2. Install this extension
  3. Run the "Manage LLM API Proxy Connection" command to set your API key
  4. Open Chat (Ctrl/Cmd + Alt + I) and select a model from the picker
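Before opening Chat, it can help to confirm the proxy is reachable and see which models it advertises. A minimal sketch, assuming the proxy exposes an OpenAI-compatible `/v1/models` endpoint (an assumption; the sample response below is inlined for illustration):

```shell
# In practice, fetch the model list from your running proxy, e.g.:
#   curl -s -H "Authorization: Bearer $LLMAPIPROXY_API_KEY" http://localhost:8000/v1/models
# A sample /v1/models-style response is inlined here so the snippet is self-contained:
response='{"object":"list","data":[{"id":"llama-3.1-8b"},{"id":"qwen2.5-coder"}]}'

# Print one model ID per line.
echo "$response" | python3 -c 'import sys, json; [print(m["id"]) for m in json.load(sys.stdin)["data"]]'
```

If the real request returns a non-empty `data` array, the model picker in Chat should show the same IDs after a refresh.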

Installation

From VS Code Marketplace

(Coming soon)

From Source

git clone https://github.com/vleeuwenmenno/llmapiproxy
cd llmapiproxy/llmapiproxy-vscode-extension
npm install
npm run compile
npm run package
code --install-extension llmapiproxy-vscode-chat-*.vsix

Configuration

Setting                 Default                 Description
llmapiproxy.proxyUrl    http://localhost:8000   Your proxy server URL
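The setting can also be placed directly in your VS Code settings.json (JSONC, so comments are allowed). Shown with the default value from the table above:

```jsonc
{
  // Point the extension at your LLM API Proxy instance.
  // This is the default; change host/port to match your deployment.
  "llmapiproxy.proxyUrl": "http://localhost:8000"
}
```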

Commands

Command                            Description
Manage LLM API Proxy Connection    Set/clear API key, change proxy URL
Refresh Available Models           Force refresh of model list
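Either command can also be bound to a key via keybindings.json. The command ID below is hypothetical (only the display names are listed above); check the extension's package.json `contributes.commands` section for the real IDs:

```jsonc
[
  {
    // Hypothetical command ID — verify against the extension's package.json.
    "key": "ctrl+alt+r",
    "command": "llmapiproxy.refreshModels"
  }
]
```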

Documentation

  • Setup Guide — Detailed installation and configuration
  • Development — Building from source, debugging

License

MIT
