# DeepSeek Provider for GitHub Copilot Chat

Brings DeepSeek models into the GitHub Copilot Chat model picker through VS Code's official Language Model Chat Provider API.
This extension does not patch or replace GitHub Copilot. It contributes DeepSeek as a first-class language model provider that works with the existing Copilot Chat experience.
## Features

- Exposes DeepSeek models directly in the GitHub Copilot Chat model selector
- Connects to DeepSeek through the official OpenAI-compatible API
- Streams text responses in real time
- Supports tool calling, so Copilot Chat features that depend on tools keep working
- Exposes both standard and thinking variants of the DeepSeek V4 models
- Preserves compatibility with the legacy DeepSeek aliases `deepseek-chat` and `deepseek-reasoner`
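To make the streaming behavior concrete, here is a minimal sketch of how OpenAI-compatible server-sent-event lines can be turned into text deltas. The shapes follow the standard OpenAI-compatible chunk format; the function names (`parseSseLine`, `collectText`) are illustrative and not the extension's actual internals.

```typescript
// One streamed chunk's delta in the OpenAI-compatible format.
interface StreamDelta {
  content?: string;
  reasoning_content?: string;
}

// Extract the delta from a single `data: {...}` SSE line; returns null for
// non-data lines and the terminating `data: [DONE]` sentinel.
function parseSseLine(line: string): StreamDelta | null {
  if (!line.startsWith("data: ")) return null;
  const payload = line.slice("data: ".length).trim();
  if (payload === "[DONE]") return null;
  const chunk = JSON.parse(payload);
  return chunk.choices?.[0]?.delta ?? null;
}

// Accumulate the visible text of a whole stream of SSE lines.
function collectText(lines: string[]): string {
  let text = "";
  for (const line of lines) {
    const delta = parseSseLine(line);
    if (delta?.content) text += delta.content;
  }
  return text;
}
```

In the real extension these deltas would be forwarded to Copilot Chat incrementally rather than concatenated at the end.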
## Requirements

- VS Code 1.104 or newer
- GitHub Copilot Chat
- A valid DeepSeek API key
Models contributed through the Language Model Chat Provider API are also subject to GitHub Copilot plan and policy restrictions.
## Quick Start

1. Open the Command Palette.
2. Run `Manage DeepSeek Provider`.
3. Save your DeepSeek API key.
4. Open GitHub Copilot Chat and pick a DeepSeek model from the model selector.
`deepseek-v4-flash` and `deepseek-v4-pro` are each exposed as two entries: a standard-mode entry and a thinking-mode entry. DeepSeek V4 enables thinking by default, so the standard entry explicitly sends the request with thinking disabled to keep Copilot Chat behavior predictable. The legacy aliases `deepseek-chat` and `deepseek-reasoner` are still shown for compatibility, but their requests are mapped onto DeepSeek V4 Flash in standard and thinking mode, respectively.
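The mapping above can be sketched as a small resolver. The picker ids, the `-thinking` suffix convention, and the `thinking` flag are assumptions made for illustration; they mirror the behavior described in this section rather than the extension's exact code.

```typescript
// What a picker entry resolves to when building the API request.
interface ResolvedModel {
  model: string;     // model name sent to the DeepSeek API
  thinking: boolean; // whether thinking mode is requested
}

function resolveModel(pickerId: string): ResolvedModel {
  switch (pickerId) {
    // Legacy aliases map onto V4 Flash in the matching mode.
    case "deepseek-chat":
      return { model: "deepseek-v4-flash", thinking: false };
    case "deepseek-reasoner":
      return { model: "deepseek-v4-flash", thinking: true };
    default: {
      // A "-thinking" suffix marks the thinking variant; the standard
      // entry explicitly disables thinking, since V4 defaults to on.
      const thinking = pickerId.endsWith("-thinking");
      const model = thinking
        ? pickerId.slice(0, -"-thinking".length)
        : pickerId;
      return { model, thinking };
    }
  }
}
```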
## Development

```shell
cd extensions/deepseek-copilot-provider
npm install
npm run compile
```

Press `F5` in VS Code to launch an Extension Development Host.
## Packaging

```shell
cd extensions/deepseek-copilot-provider
npm install
npm run package
```
The packaged VSIX file is written to the extension root.
## Known Limitations

- DeepSeek thinking mode requires `reasoning_content` to be passed back after tool-using assistant turns. This extension caches and replays that reasoning payload for tool-call turns, but very aggressive upstream message truncation can still conflict with DeepSeek's API requirements.
- Image input is not currently declared, because the initial release is focused on text and tool-calling stability inside Copilot Chat.
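The cache-and-replay approach can be sketched as follows. The `Map` keyed by tool-call id and the message shape are illustrative assumptions; the point is only that reasoning recorded on the original assistant turn can be re-attached when the host replays that turn without it.

```typescript
// A simplified assistant turn that issued a tool call.
interface AssistantTurn {
  toolCallId: string;
  content: string;
  reasoning_content?: string;
}

// Reasoning payloads remembered per tool-call id (illustrative).
const reasoningCache = new Map<string, string>();

// Record reasoning when the assistant turn that issued the tool call arrives.
function cacheReasoning(turn: AssistantTurn): void {
  if (turn.reasoning_content) {
    reasoningCache.set(turn.toolCallId, turn.reasoning_content);
  }
}

// Re-attach cached reasoning when the host replays the turn without it,
// e.g. after upstream history trimming dropped the field.
function replayReasoning(turn: AssistantTurn): AssistantTurn {
  if (!turn.reasoning_content) {
    const cached = reasoningCache.get(turn.toolCallId);
    if (cached) return { ...turn, reasoning_content: cached };
  }
  return turn;
}
```

If the upstream host truncates history so aggressively that the tool-call turn itself disappears, there is nothing left to key the replay on, which is the residual conflict noted above.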