Amazon Bedrock Provider for GitHub Copilot Chat
A VSCode extension that brings Amazon Bedrock models into GitHub Copilot Chat using VSCode's official Language Model Chat Provider API and the AWS SDK.
This is not a hack or workaround - it's built on two official APIs:
- VSCode's Language Model Chat Provider API for integrating custom models into Copilot Chat
- AWS SDK for JavaScript for connecting to Amazon Bedrock
Important: Models provided through the Language Model Chat Provider API are currently only available to users on individual GitHub Copilot plans. Organization plans are not yet supported.
Features
- Native Amazon Bedrock Integration: Access Claude, OpenAI OSS, DeepSeek, and other models directly in GitHub Copilot Chat
- Flexible Authentication: Support for AWS Profiles, API Keys (bearer tokens), or Access Keys - all stored securely
- Streaming Support: Real-time streaming responses for faster feedback
- Function Calling: Full support for tool/function calling capabilities
- Cross-Region Inference: Automatic support for cross-region inference profiles
- Extended Thinking: Automatic support for extended thinking in Claude Opus 4+, Sonnet 4+, and Sonnet 3.7 for enhanced reasoning on complex tasks. Also respects GitHub Copilot's github.copilot.chat.anthropic.thinking.enabled and github.copilot.chat.anthropic.thinking.maxTokens settings
- Thinking Effort Control: For Claude Opus 4.5, configure the thinking effort level (high/medium/low) via the bedrock.thinking.effort setting to balance quality vs. token usage. Defaults to "high" for maximum capability
- 1M Context Window: Optional 1M token context window for Claude Sonnet 4.x models (can be disabled in settings to reduce costs)
- Prompt Caching: Automatic caching of system prompts, tool definitions, and conversation history for faster responses and reduced costs (Claude and Nova models)
- Vision Support: Work with models that support image inputs
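The thinking-related settings above live in VSCode's settings.json. A sketch of what a combined configuration might look like (the maxTokens value here is an illustrative example, not a recommendation):

```jsonc
{
  // GitHub Copilot's Anthropic thinking settings, which this extension respects
  "github.copilot.chat.anthropic.thinking.enabled": true,
  "github.copilot.chat.anthropic.thinking.maxTokens": 4096,

  // Thinking effort for Claude Opus 4.5: "high" (default), "medium", or "low"
  "bedrock.thinking.effort": "medium"
}
```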
Prerequisites
- Visual Studio Code version 1.104.0 or higher
- GitHub Copilot extension
- AWS credentials (AWS Profile, API Key, or Access Keys)
- Access to Amazon Bedrock in your AWS account
Installation
- Install the extension from the VSCode marketplace
- Configure your AWS credentials if you haven't already:
- Run the "Manage Amazon Bedrock Provider" command to select your AWS profile and region
Configuration
Authentication Methods
This extension supports three authentication methods:
- AWS Profile (recommended) - Uses named profiles from ~/.aws/credentials and ~/.aws/config
- API Key - Uses Amazon Bedrock API key (stored securely in VSCode SecretStorage)
- Access Keys - Uses AWS access key ID and secret (stored securely in VSCode SecretStorage)
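For the AWS Profile method, a typical pair of entries looks like the following (the profile name, region, and key values are placeholders, not values the extension requires):

```ini
# ~/.aws/config
[profile bedrock-dev]
region = us-east-1

# ~/.aws/credentials
[bedrock-dev]
aws_access_key_id = AKIA...
aws_secret_access_key = ...
```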
To configure:
- Open the Command Palette (Cmd+Shift+P or Ctrl+Shift+P)
- Run "Manage Amazon Bedrock Provider"
- Choose "Set Authentication Method" to select your preferred method
- Follow the prompts to enter credentials
- Choose "Set Region" to select your preferred AWS region
Available Regions
The extension supports all AWS partitions including:
- Commercial AWS - All standard regions (us-east-1, eu-west-1, ap-southeast-2, etc.)
- AWS GovCloud (US) - us-gov-west-1, us-gov-east-1
- AWS China - cn-north-1, cn-northwest-1
See Model support by AWS Region in Amazon Bedrock for the latest list of supported regions and GOVCLOUD-COMPATIBILITY.md for partition-specific details.
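An AWS region's partition can be inferred from its prefix, which is how an extension like this one can pick the right endpoint family. A minimal TypeScript sketch (the function name partitionForRegion is illustrative and not part of the extension's API):

```typescript
// Map an AWS region code to its partition identifier.
// Endpoints differ per partition: commercial AWS, GovCloud (US), and China.
function partitionForRegion(region: string): string {
  if (region.startsWith("us-gov-")) return "aws-us-gov"; // AWS GovCloud (US)
  if (region.startsWith("cn-")) return "aws-cn";         // AWS China
  return "aws";                                          // commercial partition
}

// Examples:
partitionForRegion("us-east-1");     // "aws"
partitionForRegion("us-gov-west-1"); // "aws-us-gov"
partitionForRegion("cn-north-1");    // "aws-cn"
```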
Usage
Once configured, Bedrock models will appear in GitHub Copilot Chat's model selector. Simply:
- Open GitHub Copilot Chat
- Click on the model selector
- Choose a Bedrock model (they will be labeled with "Amazon Bedrock")
- Start chatting!
Supported Models
The extension automatically filters and displays only models that:
- Support tool calling (function calling), which is essential for GitHub Copilot Chat features like @workspace, @terminal, and other integrations
- Are enabled in your Amazon Bedrock console (models must be authorized and available in your selected region)
Models Automatically Excluded
The extension automatically filters models to show only text generation models (using byOutputModality: "TEXT" in the Bedrock API). This excludes:
- Embedding models
- Image generation models
- Deprecated models (models with LEGACY lifecycle status)
Models are sorted with newest inference profiles first (by creation/update date), making it easier to find recently released models.
Note: Some text models that appear in the list may have limited or no tool calling support (e.g., legacy Amazon Titan Text, AI21 Jurassic 2, Meta Llama 2 and 3.0). These will fail gracefully if tool calls are attempted.
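The filter-and-sort rules above can be sketched in TypeScript. The ModelSummary interface here is a simplified stand-in loosely modeled on Amazon Bedrock's ListFoundationModels response, not the extension's actual types:

```typescript
// Simplified model summary for illustration; field names are assumptions.
interface ModelSummary {
  modelId: string;
  outputModalities: string[];          // e.g. ["TEXT"] or ["IMAGE"]
  lifecycleStatus: "ACTIVE" | "LEGACY";
  updatedAt: Date;                     // creation/update date
}

// Keep only active text-generation models, newest first,
// mirroring the filtering and sorting rules described above.
function selectChatModels(models: ModelSummary[]): ModelSummary[] {
  return models
    .filter(m => m.outputModalities.includes("TEXT"))
    .filter(m => m.lifecycleStatus !== "LEGACY")
    .sort((a, b) => b.updatedAt.getTime() - a.updatedAt.getTime());
}
```

Given a mixed list of text, image, and legacy models, this returns only the active text models ordered from newest to oldest.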
Troubleshooting
Models not showing up
- Verify your AWS credentials are correctly configured
- Check that you've selected the correct AWS profile and region
- Enable models in the Amazon Bedrock console: Go to the Bedrock Model Access page and request access to the models you want to use
- Ensure your AWS account has access to Bedrock in the selected region
- Check the "Amazon Bedrock Models" output channel for error messages
Authentication errors
- Verify your AWS credentials are valid and not expired
- Check that your IAM user/role has the necessary Bedrock permissions:
Option 1: Use AWS Managed Policy (Recommended)
Attach the AmazonBedrockLimitedAccess managed policy to your IAM user or role. This policy includes all required permissions for using this extension.
Option 2: Custom Policy with Specific Permissions
If you prefer granular control, ensure your policy includes:
- bedrock:ListFoundationModels - List available models (optional but recommended; without it, the extension falls back to checking Anthropic models only)
- bedrock:GetFoundationModelAvailability - Check model access status (optional but recommended)
- bedrock:ListInferenceProfiles - List cross-region inference profiles
- bedrock:InvokeModel - Invoke models
- bedrock:InvokeModelWithResponseStream - Stream model responses
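A custom IAM policy granting these actions might look like the sketch below. The broad "Resource": "*" is for brevity only; scope it to specific model and inference-profile ARNs where your security posture requires it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:ListFoundationModels",
        "bedrock:GetFoundationModelAvailability",
        "bedrock:ListInferenceProfiles",
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```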
License
MIT