DeepSeek Provider for GitHub Copilot Chat

Jobs Lee
Use DeepSeek models inside GitHub Copilot Chat through VS Code's official Language Model Chat Provider API.
Installation
Launch VS Code Quick Open (Ctrl+P), paste the extension's install command from its Marketplace page, and press Enter.

DeepSeek Provider for GitHub Copilot Chat brings DeepSeek models into the GitHub Copilot Chat model picker through VS Code's official Language Model Chat Provider API.

This extension does not patch or replace GitHub Copilot. It contributes DeepSeek as a first-class language model provider that works with the existing Copilot Chat experience.

Features

  • Exposes DeepSeek models directly in the GitHub Copilot Chat model selector
  • Connects to DeepSeek through the official OpenAI-compatible API
  • Streams text responses in real time
  • Supports tool calling so Copilot Chat features that depend on tools can keep working
  • Exposes both standard and thinking variants for DeepSeek V4 models
  • Preserves compatibility with the legacy DeepSeek aliases deepseek-chat and deepseek-reasoner
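The real-time streaming mentioned above uses the OpenAI-compatible server-sent-events format. As a rough sketch of how such a stream is consumed (the `StreamDelta` shape here is an assumption modeled on the OpenAI chat-completions delta format, not the extension's actual code):

```typescript
// Parse OpenAI-compatible SSE lines into text deltas. The payload shape
// (choices[0].delta.content / reasoning_content) is an assumption based on
// the OpenAI chat-completions streaming format that DeepSeek mirrors.
interface StreamDelta {
  content?: string;
  reasoning_content?: string; // emitted by thinking-mode models
}

function parseSseLine(line: string): StreamDelta | null {
  if (!line.startsWith("data: ")) return null;
  const payload = line.slice("data: ".length).trim();
  if (payload === "[DONE]") return null; // end-of-stream sentinel
  const json = JSON.parse(payload);
  return json.choices?.[0]?.delta ?? null;
}

// Accumulate the user-visible text from a stream transcript.
function collectText(lines: string[]): string {
  let text = "";
  for (const line of lines) {
    const delta = parseSseLine(line);
    if (delta?.content) text += delta.content;
  }
  return text;
}
```

Each `data:` line carries one incremental delta, so the provider can forward text to Copilot Chat as it arrives rather than waiting for the full completion.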

Requirements

  • VS Code 1.104 or newer
  • GitHub Copilot Chat
  • A valid DeepSeek API key

Models contributed through the Language Model Chat Provider API are also subject to GitHub Copilot plan and policy restrictions.

Quick Start

  1. Open the Command Palette.
  2. Run Manage DeepSeek Provider.
  3. Save your DeepSeek API key.
  4. Open GitHub Copilot Chat and pick a DeepSeek model from the model selector.

Settings

The extension contributes the following settings:

  • languageModelChatProvider.deepseek.baseUrl
  • languageModelChatProvider.deepseek.modelIds
  • languageModelChatProvider.deepseek.reasoningEffort

The default API base URL is https://api.deepseek.com.
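Putting the three settings together, a `settings.json` entry might look like the sketch below. Only the base URL default comes from this page; the `modelIds` and `reasoningEffort` values are illustrative assumptions, not documented defaults.

```json
{
  "languageModelChatProvider.deepseek.baseUrl": "https://api.deepseek.com",
  "languageModelChatProvider.deepseek.modelIds": [
    "deepseek-v4-flash",
    "deepseek-v4-pro"
  ],
  "languageModelChatProvider.deepseek.reasoningEffort": "medium"
}
```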

Model Mapping Strategy

  • deepseek-v4-flash and deepseek-v4-pro are each exposed as two entries: a standard mode entry and a thinking mode entry.
  • DeepSeek V4 enables thinking by default, so the standard entry explicitly requests thinking disabled to keep Copilot Chat behavior predictable.
  • The legacy aliases deepseek-chat and deepseek-reasoner are still shown for compatibility, but requests are mapped onto DeepSeek V4 Flash in standard or thinking mode.
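The mapping above can be sketched as a small resolver. The `-thinking` suffix convention for the duplicated V4 entries and the helper itself are assumptions for illustration, not the extension's actual identifiers:

```typescript
// Illustrative sketch of the alias-mapping strategy: legacy picker ids map
// onto DeepSeek V4 Flash, and each V4 model appears twice, with an assumed
// "-thinking" suffix marking the thinking-mode entry.
interface ResolvedModel {
  model: string;    // model id sent to the DeepSeek API
  thinking: boolean; // whether thinking mode is requested
}

function resolveModel(pickerId: string): ResolvedModel {
  switch (pickerId) {
    // Legacy aliases are kept for compatibility but routed to V4 Flash.
    case "deepseek-chat":
      return { model: "deepseek-v4-flash", thinking: false };
    case "deepseek-reasoner":
      return { model: "deepseek-v4-flash", thinking: true };
    default: {
      const thinking = pickerId.endsWith("-thinking");
      const model = thinking
        ? pickerId.slice(0, -"-thinking".length)
        : pickerId;
      return { model, thinking };
    }
  }
}
```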

Development

cd extensions/deepseek-copilot-provider
npm install
npm run compile

Press F5 in VS Code to launch an Extension Development Host.

Packaging

cd extensions/deepseek-copilot-provider
npm install
npm run package

The packaged VSIX file is written to the extension root.

Known Limitations

  • DeepSeek thinking mode requires reasoning_content to be passed back after tool-using assistant turns. This extension caches and replays that reasoning payload for tool-call turns, but very aggressive upstream message truncation can still conflict with DeepSeek's API requirements.
  • Image input is not currently declared because the initial release is focused on text and tool-calling stability inside Copilot Chat.
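The cache-and-replay behavior for `reasoning_content` can be sketched as follows. The message shape and the keying by tool-call id are assumptions for illustration; the extension's real implementation may differ.

```typescript
// Sketch of caching the reasoning payload from a tool-calling assistant
// turn and restoring it before the history is resent, since DeepSeek's
// thinking mode expects reasoning_content back on those turns. The
// ChatMessage shape here is an illustrative assumption.
interface ChatMessage {
  role: "assistant" | "tool" | "user";
  content: string;
  tool_call_id?: string;
  reasoning_content?: string;
}

const reasoningCache = new Map<string, string>();

// Record the reasoning emitted alongside a tool-calling turn.
function cacheReasoning(toolCallId: string, reasoning: string): void {
  reasoningCache.set(toolCallId, reasoning);
}

// Restore reasoning_content on assistant turns that upstream layers
// (e.g. history truncation) may have stripped before resending.
function replayReasoning(history: ChatMessage[]): ChatMessage[] {
  return history.map((msg) => {
    if (msg.role === "assistant" && msg.tool_call_id && !msg.reasoning_content) {
      const cached = reasoningCache.get(msg.tool_call_id);
      if (cached !== undefined) {
        return { ...msg, reasoning_content: cached };
      }
    }
    return msg;
  });
}
```

If the upstream layer drops the tool-calling turn entirely rather than just its reasoning field, there is nothing left to replay, which is the conflict the first limitation describes.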