# AWS Bedrock Models for GitHub Copilot Chat (VS Code Extension)

Use AWS Bedrock models directly in GitHub Copilot Chat, including Claude, Llama, Mistral, Qwen, and more.
- Keep code and prompts in your AWS account for stronger governance
- Choose your AWS region to align with residency and compliance requirements
- Streaming + tool calling for responsive coding workflows
- Multi-region support across 12 AWS regions
## Why This Extension
- Compliance-first architecture: prompts, code context, and responses stay within your AWS account boundary.
- Data residency control: select the AWS region your team is allowed to use and keep traffic there.
- Enterprise-ready access model: works with existing AWS credentials, profiles, and IAM controls.
- No model lock-in: use multiple Bedrock model families from one Copilot Chat workflow.
- Built for developer UX: streaming responses, tool calling, and model switching in the standard chat UI.
## Supported Model Families

### OpenAI
- gpt-oss-20b, gpt-oss-120b
- Safeguard variants: gpt-oss-safeguard-20b/120b

### Google
- Gemma 3: 4b, 12b, and 27b variants

### Mistral
- magistral-small-2509
- mistral-large-3-675b-instruct
- Ministral: 3b, 8b, and 14b variants
- Voxtral: mini-3b and small-24b variants

### Qwen
- General: qwen3-32b, qwen3-235b, qwen3-next-80b
- Vision: qwen3-vl-235b (multimodal)
- Coding: qwen3-coder-30b/480b

### DeepSeek
- deepseek.v3.1

### Nvidia
- nemotron-nano-9b-v2, nemotron-nano-12b-v2

### Others
- MoonshotAI: kimi-k2-thinking
- Minimax: minimax-m2
- ZAI: glm-4.6
## Prerequisites

- VS Code version 1.104.0 or later
- One of two authentication options:
  - API key mode: an API key from the AWS Bedrock Console
  - AWS credentials mode: AWS credentials or a profile available to VS Code (environment variables, `~/.aws/credentials`, SSO, etc.); you can also set `aws-bedrock.awsProfile` to pin a named profile
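In AWS credentials mode, the identity you sign with needs IAM permissions for Bedrock. A minimal illustrative policy is sketched below; the action list is an assumption based on the APIs this extension uses, and you should tighten `Resource` to specific model ARNs per your governance requirements:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:ListFoundationModels",
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
        "bedrock:Converse",
        "bedrock:ConverseStream"
      ],
      "Resource": "*"
    }
  ]
}
```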
## Installation

### From VS Code Marketplace
- Open VS Code
- Go to Extensions (Cmd+Shift+X / Ctrl+Shift+X)
- Search for "Bedrock LLMs for GitHub Copilot Chat"
- Click Install

### From Source

Clone this repository:

```bash
git clone https://github.com/easytocloud/bedrock-vscode-chat.git
cd bedrock-vscode-chat
```

Install dependencies:

```bash
npm install
```

Compile the extension:

```bash
npm run compile
```

Press F5 to open a new VS Code window with the extension loaded.
## Setup

The extension supports two authentication methods.

### Option A: API Key (Simpler)

Via Command Palette:
- Open the Command Palette (Cmd+Shift+P / Ctrl+Shift+P)
- Run: `Manage AWS Bedrock`
- Select "Enter API Key (Mantle)"
- Paste your API key from the AWS Bedrock Console

On first use:
- The extension prompts for your API key when required
- Your key is stored securely in VS Code's SecretStorage

### Option B: AWS Credentials (Better for Existing AWS Setups)

- Open the Command Palette
- Run: `Manage AWS Bedrock`
- Select "Configure Mantle Authentication"
- Choose "AWS Credentials"
- Optionally set a specific profile via "Set AWS Profile (Mantle)"

This method uses AWS Signature V4 authentication with your existing AWS credentials.

If you want a specific named profile for the Converse API path:
- Run: `Manage AWS Bedrock`
- Select "Set AWS Profile (Native)"
- Enter a profile name (or leave blank to use the default credential chain)
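Named profiles live in your AWS config files. For reference, a hypothetical SSO-based profile in `~/.aws/config` might look like this (all values illustrative):

```ini
# ~/.aws/config -- example SSO profile; every value here is a placeholder
[profile my-profile]
sso_start_url  = https://example.awsapps.com/start
sso_region     = us-east-1
sso_account_id = 123456789012
sso_role_name  = BedrockUser
region         = us-west-2
```

After `aws sso login --profile my-profile`, point the extension at the profile via "Set AWS Profile (Mantle)"/"Set AWS Profile (Native)" or the corresponding settings.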
### Select Region (Optional)

The default region is us-east-1. To change it:
- Open the Command Palette
- Run: `Manage AWS Bedrock`
- Select "Change Region"
- Choose your preferred AWS region

Or set it in Settings:

```jsonc
{
  "aws-bedrock.region": "us-west-2",
  "aws-bedrock.mantleAuthMethod": "awsCredentials", // or "apiKey"
  "aws-bedrock.mantleAwsProfile": "my-profile",     // optional
  "aws-bedrock.awsProfile": "my-profile"            // for native Bedrock
}
```

To show or hide specialized models (such as safeguard variants):

```jsonc
{
  "aws-bedrock.showAllModels": true // default: true
}
```
## Usage

### Using in Chat
- Open GitHub Copilot Chat (Cmd+Shift+I / Ctrl+Shift+I)
- Click the model picker at the top of the chat panel
- Select an AWS Bedrock model (e.g., "OpenAI GPT OSS 120B")
- Start chatting!

### Using with Copilot Chat
- In any editor, use `@workspace` or other chat participants
- The model picker will include Bedrock models
- Select a Bedrock model for your conversation

### Example Chat

```
You: What are the key features of Rust's ownership system?

Assistant (via Bedrock): [Streams response in real-time...]
```
## Configuration

### Settings

| Setting | Type | Default | Description |
|---|---|---|---|
| `aws-bedrock.region` | string | `us-east-1` | AWS region for Bedrock requests |
| `aws-bedrock.enableMantle` | boolean | `true` | Enable models available through API key mode |
| `aws-bedrock.enableNative` | boolean | `true` | Enable models available through the Converse API |
| `aws-bedrock.mantleAuthMethod` | string | `apiKey` | Auth mode for the API key path: `apiKey` or `awsCredentials` |
| `aws-bedrock.mantleAwsProfile` | string | empty | Optional AWS profile for the API key path when using credentials |
| `aws-bedrock.awsProfile` | string | empty | Optional AWS profile for the Converse API path |
| `aws-bedrock.showAllModels` | boolean | `true` | Show all models, including specialized variants |
| `aws-bedrock.debugLogging` | boolean | `false` | Enable verbose debug logging |
| `aws-bedrock.sendTools` | boolean | `true` | Send tool definitions to the model |
| `aws-bedrock.emitPlaceholders` | boolean | `true` | Emit placeholder text while waiting |
| `aws-bedrock.modelMetadataSource` | string | `litellm` | Metadata source for token/capability info |
| `aws-bedrock.modelMetadataUrl` | string | default URL | External metadata registry URL |
| `aws-bedrock.modelMetadataCacheHours` | number | `24` | Cache duration (hours) for external metadata |

Note: setting keys and some command labels include "mantle" naming for backward compatibility.
## Supported Regions

- us-east-1 (N. Virginia) - default
- us-east-2 (Ohio)
- us-west-2 (Oregon)
- eu-west-1 (Ireland)
- eu-west-2 (London)
- eu-central-1 (Frankfurt)
- eu-north-1 (Stockholm)
- eu-south-1 (Milan)
- ap-south-1 (Mumbai)
- ap-northeast-1 (Tokyo)
- ap-southeast-3 (Jakarta)
- sa-east-1 (São Paulo)
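In API key mode, requests target a per-region endpoint. A tiny illustrative helper, with the host template taken from the endpoint shown in the Architecture section (the function name and validation are this sketch's own, not the extension's code):

```typescript
// Regions listed as supported by this extension.
const SUPPORTED_REGIONS = new Set([
  "us-east-1", "us-east-2", "us-west-2",
  "eu-west-1", "eu-west-2", "eu-central-1", "eu-north-1", "eu-south-1",
  "ap-south-1", "ap-northeast-1", "ap-southeast-3", "sa-east-1",
]);

// Build the API-key-mode base URL for a region, rejecting unknown regions
// early rather than failing with an opaque DNS error later.
function mantleEndpoint(region: string): string {
  if (!SUPPORTED_REGIONS.has(region)) {
    throw new Error(`Unsupported region: ${region}`);
  }
  return `https://bedrock-mantle.${region}.api.aws/v1`;
}
```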
## Commands

| Command | Description |
|---|---|
| `Manage AWS Bedrock` | Configure authentication, AWS profile, region, and settings |
| `Clear AWS Bedrock API Key (Mantle)` | Remove the stored AWS Bedrock API key |
| `Show AWS Bedrock Logs` | Open the extension output channel |
## Architecture

This extension implements VS Code's `LanguageModelChatProvider` interface on top of AWS Bedrock APIs.

### Key Components
- **BedrockMantleProvider**: main provider implementing the GitHub Copilot Chat provider interface
- **Dynamic model discovery**: fetches available model catalogs from AWS Bedrock APIs
- **Streaming support**: processes SSE (Server-Sent Events) for real-time responses
- **Tool calling**: buffers and parses streaming tool calls for function-calling support

API key mode endpoint:

```
https://bedrock-mantle.<region>.api.aws/v1
```
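The streaming and tool-call-buffering components can be illustrated with a simplified sketch. It assumes an OpenAI-style SSE body in which each `data:` line carries a JSON chunk holding either a text delta or a tool-call argument fragment; the real wire format and the extension's actual code differ:

```typescript
// Sketch only: tool-call JSON arguments arrive in string fragments across
// chunks, so they must be buffered and parsed only once the stream ends.
interface ToolCallBuffer {
  name?: string;
  args: string; // concatenated argument fragments (not yet valid JSON)
}

interface StreamResult {
  text: string;
  toolCalls: { name: string; args: unknown }[];
}

function processSseBody(body: string): StreamResult {
  const result: StreamResult = { text: "", toolCalls: [] };
  const buffers = new Map<number, ToolCallBuffer>();

  for (const line of body.split("\n")) {
    if (!line.startsWith("data:")) continue; // skip comments and blanks
    const payload = line.slice(5).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel

    const chunk = JSON.parse(payload);
    const delta = chunk.choices?.[0]?.delta ?? {};

    if (typeof delta.content === "string") {
      result.text += delta.content; // plain streamed text
    }
    for (const tc of delta.tool_calls ?? []) {
      const buf = buffers.get(tc.index) ?? { args: "" };
      if (tc.function?.name) buf.name = tc.function.name;
      if (tc.function?.arguments) buf.args += tc.function.arguments;
      buffers.set(tc.index, buf);
    }
  }

  // Fragments are only parseable after the stream completes.
  for (const buf of buffers.values()) {
    if (buf.name) {
      result.toolCalls.push({ name: buf.name, args: JSON.parse(buf.args) });
    }
  }
  return result;
}
```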
## Model Capabilities

### Function Calling
Models with function-calling support:
- gpt-oss-120b
- mistral-large-3-675b-instruct
- magistral-small-2509
- deepseek.v3.1
- qwen3-235b and larger models
- qwen3-vl-235b (vision + tools)

### Vision Support
Models with multimodal (image) input:
- API key mode models: determined from model naming and API behavior
- Converse API mode models: determined from Bedrock's reported input modalities

### How Capabilities Are Determined
- Token limits and initial capabilities: the extension can optionally use an external model metadata registry (default: LiteLLM's public JSON) to populate `maxInputTokens`, `maxOutputTokens`, and initial tool/vision flags. Configure via `aws-bedrock.modelMetadataSource`, `aws-bedrock.modelMetadataUrl`, and `aws-bedrock.modelMetadataCacheHours`.
- Converse API models: vision is derived from `ListFoundationModels` input modalities (reliable). Tool support is verified on demand by attempting a tool-enabled request and caching whether the model accepts the tool config; this overrides external metadata if runtime behavior differs.
- API key catalog models: `/v1/models` does not include full tool/vision/token metadata, so the extension uses external metadata when enabled, plus runtime probing (for tools) as a safety net.
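The resolution order described above can be sketched as follows. This is illustrative TypeScript, not the extension's actual code; the class, method names, and data shapes are invented for the example:

```typescript
// Sketch: external metadata seeds tool/vision flags, a cached runtime
// probe overrides the tool flag, and metadata is considered stale once
// the configured cache window expires.
interface ModelCapabilities {
  tools: boolean;
  vision: boolean;
}

class CapabilityResolver {
  // modelId -> did a tool-enabled probe request succeed?
  private probeCache = new Map<string, boolean>();

  constructor(
    private externalMetadata: Map<string, ModelCapabilities>,
    private fetchedAt: number, // epoch ms when metadata was fetched
    private cacheHours: number,
  ) {}

  // True when external metadata is older than the cache window.
  isStale(now: number): boolean {
    return now - this.fetchedAt > this.cacheHours * 3_600_000;
  }

  // Record the outcome of an on-demand tool-enabled probe request.
  recordProbe(modelId: string, acceptsTools: boolean): void {
    this.probeCache.set(modelId, acceptsTools);
  }

  resolve(modelId: string): ModelCapabilities {
    const base =
      this.externalMetadata.get(modelId) ?? { tools: false, vision: false };
    const probed = this.probeCache.get(modelId);
    // Observed runtime behavior wins over external metadata.
    return { ...base, tools: probed ?? base.tools };
  }
}
```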
### Code Specialization
Models optimized for coding:
- qwen3-coder-30b-a3b-instruct
- qwen3-coder-480b-a35b-instruct

### Reasoning/Thinking
Models with enhanced reasoning:
## Troubleshooting

### API Key Issues
Problem: "Invalid API key" error

Solution:
- Verify your API key in the AWS Bedrock Console
- Run `Manage AWS Bedrock` → "Clear API Key (Mantle)"
- Re-enter your API key

### Model Not Available
Problem: "Model not available in region" error

Solution:
- Switch to a region where the model is offered via `Manage AWS Bedrock` → "Change Region"
- Check model availability for your region in the AWS Bedrock Console

### Rate Limiting
Problem: "Rate limit exceeded" error

Solution:
- Wait a few moments and try again
- Consider using smaller models for testing
- Check your AWS Bedrock quotas in the AWS Console

### Connection Issues
Problem: Network or timeout errors

Solution:
- Check your internet connection
- Verify firewall/proxy settings allow access to `*.api.aws`
- Ensure the selected region is accessible from your location
## Development

### Building from Source

```bash
# Install dependencies
npm install

# Compile TypeScript
npm run compile

# Watch mode for development
npm run watch

# Run linting
npm run lint
```

Or use the Makefile shortcuts:

```bash
make install
make compile
make watch
make lint
```
### Debugging
- Open the project in VS Code
- Press F5 to launch the Extension Development Host
- Set breakpoints in source files
- Test the extension in the new window
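If F5 does not pick up a launch configuration, a typical `.vscode/launch.json` for an extension host session looks like the following; the `outFiles` path and `preLaunchTask` name are assumptions based on a standard `npm run compile` setup emitting to `out/`:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Run Extension",
      "type": "extensionHost",
      "request": "launch",
      "args": ["--extensionDevelopmentPath=${workspaceFolder}"],
      "outFiles": ["${workspaceFolder}/out/**/*.js"],
      "preLaunchTask": "npm: compile"
    }
  ]
}
```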
### Project Structure

```
bedrock-vscode-chat/
├── src/
│   ├── extension.ts              # Extension entry point
│   ├── provider.ts               # Main provider implementation
│   ├── bedrockNative.ts          # Native Bedrock Converse API
│   ├── externalModelMetadata.ts  # External model metadata loader
│   ├── types.ts                  # TypeScript type definitions
│   └── utils.ts                  # Utility functions
├── package.json                  # Extension manifest
├── tsconfig.json                 # TypeScript configuration
├── icon.svg                      # Source icon (editable)
├── icon.png                      # Extension icon (128x128)
├── README.md                     # This file
├── CONTRIBUTING.md               # Development guide
└── PLAN.md                       # Architecture details
```
## Contributing

Contributions are welcome! See CONTRIBUTING.md for detailed development guidelines.

Quick start for contributors:
- Fork and clone the repository
- Install dependencies: `npm install`
- Compile: `npm run compile`
- Press F5 to launch the Extension Development Host
- See CONTRIBUTING.md for testing, logging, and publishing guidelines

Key development notes:
- Publisher name: `easytocloud` (lowercase)
- Use the Output Channel for logging, not `console.log`
- Include `node_modules` in the VSIX (required for the AWS SDK)
- Test in both F5 mode and the installed VSIX
- Use `rsvg-convert` for icon generation
## License

MIT License - see the LICENSE file for details.

## Credits

- Project Lead: easytocloud
- Development Assistant: GitHub Copilot

### Acknowledgments

Inspired by the HuggingFace extension for GitHub Copilot Chat.

## Support

- Version: 0.3.1
- Status: Production
- Last Updated: February 5, 2026