# BYOK for VSCode Copilot Chat
A bring-your-own-key (BYOK) extension for VS Code Copilot Chat: bring third-party model providers into VS Code Copilot natively. The extension supports both OpenAI-compatible and Anthropic-compatible services.
## Features
- Supports two kinds of model providers:
  - OpenAI-compatible
  - Anthropic-compatible
- Supports custom API keys and endpoints
- Supports custom model lists with capabilities, context limits, and tool calling / vision / thinking flags
- Feels and behaves like native Copilot as closely as possible
## How to Use
Add your model configuration:
- Open the model picker -> **Add Model** -> choose **OpenAI-compatible** or **Anthropic-compatible** -> follow the instructions

Then choose your custom model from the model picker:

You can start chatting right away after that. 😊
## Configuration Example
```json
{
  "name": "example",
  "vendor": "byok-anthropic",
  "apiKey": "${input:chat.lm.secret.-3b3a9fe3}",
  "endpoint": "https://api.example.org/",
  "models": [
    {
      "id": "claude-opus-4-6",
      "name": "Claude Opus 4.6",
      "vision": true,
      "toolCalling": true,
      "thinking": true,
      "adaptiveThinking": true,
      "supportsReasoningEffort": [
        "low",
        "medium",
        "high",
        "max"
      ],
      "maxInputTokens": 872000,
      "maxOutputTokens": 128000
    }
  ]
}
```
Field descriptions for `models`:

- `id`: the model identifier, such as `claude-opus-4-7` or `gpt-5.5`
- `name`: the display name shown in the model picker
- `vision`: whether the model supports vision input
- `toolCalling`: whether the model supports tool calling
- `thinking`: whether the model supports thinking mode

The remaining fields are only needed when you are defining a custom model:

- `adaptiveThinking`: whether the model supports adaptive thinking
- `supportsReasoningEffort`: supported reasoning effort levels, if applicable
- `maxInputTokens`: maximum number of input tokens
- `maxOutputTokens`: maximum number of output tokens
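An OpenAI-compatible provider is configured the same way. The sketch below is an illustrative guess: the vendor id `byok-openai`, the endpoint, and the model entry are all hypothetical (chosen by analogy with the Anthropic example above), and the `apiKey` secret reference is a placeholder generated by the extension when you add your key:

```json
{
  "name": "example-openai",
  "vendor": "byok-openai",
  "apiKey": "${input:chat.lm.secret.-3b3a9fe3}",
  "endpoint": "https://api.example.com/v1/",
  "models": [
    {
      "id": "gpt-5.5",
      "name": "GPT-5.5",
      "vision": true,
      "toolCalling": true,
      "thinking": false,
      "maxInputTokens": 272000,
      "maxOutputTokens": 128000
    }
  ]
}
```

Since `thinking` is `false` here, the optional thinking-related fields (`adaptiveThinking`, `supportsReasoningEffort`) are simply omitted.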
## Acknowledgements
This extension was inspired by the [DeepSeek V4 for Copilot Chat](https://marketplace.visualstudio.com/items?itemName=Vizards.deepseek-v4-for-copilot) extension.