DeepSeek V4 for Copilot Chat

by Vizards
Pick DeepSeek V4 from the Copilot Chat model picker — and keep everything else Copilot already gives you.

Love DeepSeek's price-performance but don't want to give up GitHub Copilot's agent mode, tool calling, and polished UI? This extension drops DeepSeek V4 Pro & Flash straight into the Copilot Chat model selector — with vision, thinking mode, and your own API key.

Why this extension?

  • Don't replace Copilot — power it up. No new sidebar, no new chat UI to learn. Just a new model in the picker you already use.
  • Agent mode, tool calling, instructions, MCP, skills — all of it still works. Copilot's entire stack, now running on DeepSeek.
  • Vision on a text-only model. DeepSeek V4 can't see images. This extension proxies any image you drop into chat through another Copilot model you already have, then feeds the description to DeepSeek — transparently.
  • BYOK, pay DeepSeek directly. Your API key, your bill, your rate limits. Stored in the OS keychain, never on disk.

Features

DeepSeek V4 Pro & Flash in the model picker

Both models show up alongside GPT-4o, Claude, and friends in Copilot Chat's model selector. 1M token context on both. Switch models mid-chat without losing history.

Transparent Vision Proxy

DeepSeek V4 is text-only. Drop a screenshot into chat and this extension automatically hands the image to another installed Copilot model (Claude, GPT-4o, whatever you've got), gets a description, and feeds that back to DeepSeek. Zero config — just pick your preferred vision model once.
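
The proxy flow can be sketched with VS Code's language model API. This is a simplified illustration, not the extension's actual source: the model family and prompt are placeholders, and the image is passed as a plain-text reference here, whereas real image handling would attach binary data.

```typescript
import * as vscode from 'vscode';

// Hedged sketch of the vision proxy: ask another installed Copilot model
// to describe the attachment, then hand that description to DeepSeek as text.
async function describeAttachment(
  imageSummaryRequest: string, // prompt + image reference (simplified here)
  token: vscode.CancellationToken,
): Promise<string> {
  // Pick whichever vision-capable model the user configured (family is illustrative)
  const [visionModel] = await vscode.lm.selectChatModels({ family: 'gpt-4o' });
  const response = await visionModel.sendRequest(
    [vscode.LanguageModelChatMessage.User(imageSummaryRequest)],
    {},
    token,
  );
  let description = '';
  for await (const chunk of response.text) {
    description += chunk; // stream the description as it arrives
  }
  return description; // this text stands in for the image in the DeepSeek prompt
}
```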

Thinking Mode with Reasoning Effort Control

Full support for DeepSeek V4's reasoning_content. Toggle thinking on or off as needed: on for complex refactors, off to save tokens on quick edits. Choose between high (balanced) and max (deep reasoning for hard agent tasks).
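
As a sketch, the per-request toggle might shape the request body like this. The thinking field names below are assumptions for illustration, not the documented DeepSeek wire format:

```typescript
// Assumed shape of the thinking controls; not the official DeepSeek schema.
type ThinkingEffort = 'high' | 'max';

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Build a chat-completions body with thinking toggled per request.
function buildChatBody(
  model: string,
  messages: ChatMessage[],
  thinking: boolean,
  effort: ThinkingEffort = 'high',
) {
  return {
    model,
    messages,
    // Hypothetical field: enable chain-of-thought only when requested
    thinking: thinking ? { type: 'enabled', effort } : { type: 'disabled' },
  };
}

const body = buildChatBody(
  'deepseek-v4-pro',
  [{ role: 'user', content: 'Refactor this function to be iterative.' }],
  true,
  'max',
);
console.log(JSON.stringify(body.thinking)); // {"type":"enabled","effort":"max"}
```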

Inherits Every Copilot Capability

Because this plugs into Copilot's native provider API, you get the full stack for free:

  • Agent mode — autonomous multi-step tasks
  • Tool calling — file edits, terminal, workspace search, Git, tests
  • Instructions & skills — all your .instructions.md, AGENTS.md, and skills just work
  • Prompt caching stats — DeepSeek's cache hit rate logged in the output channel so you can see the savings

Secure by Default

API key lives in VS Code's SecretStorage (OS keychain on macOS / Windows / Linux). Never in settings.json, never in your Git history.
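
Inside an extension, that amounts to VS Code's SecretStorage API. A minimal sketch (the command ID and secret key name are illustrative, not necessarily this extension's):

```typescript
import * as vscode from 'vscode';

// Sketch of a "Set API Key" command: prompt once, persist to the OS keychain.
export function activate(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    vscode.commands.registerCommand('deepseek-copilot.setApiKey', async () => {
      const key = await vscode.window.showInputBox({
        prompt: 'DeepSeek API key (sk-...)',
        password: true, // mask the input
        ignoreFocusOut: true,
      });
      if (key) {
        // SecretStorage writes to the OS keychain, never to settings.json
        await context.secrets.store('deepseek-copilot.apiKey', key);
      }
    }),
  );
}
```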

Zero Runtime Dependencies

Pure VS Code API + Node.js built-ins. No Python, no Docker, no local proxy server to babysit.

Getting Started

Prerequisites

  • VS Code 1.104 or later
  • GitHub Copilot subscription (Free / Pro / Enterprise — the free tier works)
  • DeepSeek API key from platform.deepseek.com

Usage

  1. Install from the VS Code Marketplace
  2. Run DeepSeek: Set API Key from the Command Palette (Ctrl+Shift+P / Cmd+Shift+P)
  3. Paste your key (starts with sk-)
  4. Open Copilot Chat, click the model picker, pick DeepSeek V4 Pro or DeepSeek V4 Flash
  5. That's it — chat away
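
To sanity-check the key outside VS Code, you can hit DeepSeek's OpenAI-compatible endpoint directly. The model name below is illustrative; a valid key returns a JSON chat completion:

```shell
# Assumes DEEPSEEK_API_KEY is set in your shell
curl https://api.deepseek.com/chat/completions \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-chat", "messages": [{"role": "user", "content": "ping"}]}'
```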

Models

| Model | Best For |
| --- | --- |
| DeepSeek V4 Flash | Fast everyday coding, quick edits, cheap iteration |
| DeepSeek V4 Pro | Complex refactors, agent tasks, deep reasoning |

Both support thinking mode, tool calling, and 1M token context.

Settings

| Setting | Default | Description |
| --- | --- | --- |
| `deepseek-copilot.baseUrl` | `https://api.deepseek.com` | API endpoint — change for self-hosted / proxied deployments |
| `deepseek-copilot.thinking` | `true` | Enable thinking mode (chain-of-thought). Disable for simple tasks to save tokens |
| `deepseek-copilot.thinkingEffort` | `high` | `high` for most tasks, `max` for complex agent work |
| `deepseek-copilot.maxTokens` | `0` | Max output tokens (`0` = no limit). Useful for cost control |
| `deepseek-copilot.visionModel` | (auto) | Which Copilot model to proxy images through |
| `deepseek-copilot.visionPrompt` | (built-in) | Prompt used to describe image attachments |
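
For example, a settings.json tuned for heavy agent work might look like this (the visionModel value is illustrative, not a guaranteed model ID):

```json
{
  "deepseek-copilot.thinking": true,
  "deepseek-copilot.thinkingEffort": "max",
  "deepseek-copilot.maxTokens": 0,
  "deepseek-copilot.visionModel": "claude-sonnet-4"
}
```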

Compared to alternatives

| | This extension | Local proxy (e.g. LiteLLM) | Standalone DeepSeek extensions |
| --- | --- | --- | --- |
| Works inside Copilot Chat | ✅ | ✅ | ❌ separate UI |
| Agent mode, tools, skills | ✅ | ✅ | ⚠️ reimplemented |
| Vision support | ✅ proxied | ❌ | ❌ |
| Extra process to run | ❌ | ✅ | ❌ |
| One-click install | ✅ | ❌ | ✅ |
| API key in OS keychain | ✅ | ❌ | ⚠️ varies |

License

MIT

  • Contact us
  • Jobs
  • Privacy
  • Manage cookies
  • Terms of use
  • Trademarks
© 2026 Microsoft