Amazon Bedrock Provider for GitHub Copilot Chat

Konstantin Vyatkin

Native Amazon Bedrock integration for GitHub Copilot Chat using AWS named profiles. Brings Claude, Llama, and Mistral models into your editor with streaming, function calling, and vision support.

Amazon Bedrock Provider in GitHub Copilot Chat

A VS Code extension that brings Amazon Bedrock models into GitHub Copilot Chat, authenticating with AWS named profiles.

Features

  • Native Amazon Bedrock Integration: Access Claude, Amazon Nova, DeepSeek, OpenAI OSS, and other models directly in GitHub Copilot Chat
  • AWS Profile Support: Uses AWS named profiles from your ~/.aws/credentials and ~/.aws/config files
  • Streaming Support: Real-time streaming responses for faster feedback
  • Function Calling: Full support for tool/function calling capabilities
  • Vision Support: Work with models that support image inputs
  • Cross-Region Inference: Automatic support for cross-region inference profiles
  • Prompt Caching: Automatic caching of system prompts, tool definitions, and conversation history for faster responses and reduced costs (Claude and Nova models)

Prerequisites

  • Visual Studio Code version 1.104.0 or higher
  • GitHub Copilot subscription
  • AWS credentials configured in ~/.aws/credentials or ~/.aws/config
  • Access to Amazon Bedrock in your AWS account

Installation

  1. Install the extension from the VS Code Marketplace
  2. Configure your AWS credentials if you haven't already:
    • See AWS CLI Configuration for details
  3. Run the "Manage Amazon Bedrock Provider" command to select your AWS profile and region

Configuration

Setting up AWS Profiles

This extension uses AWS named profiles from your AWS configuration files. You can:

  1. Use the default AWS credentials chain (no profile selected)
  2. Select a specific named profile from your AWS configuration

To configure:

  1. Open the Command Palette (Cmd+Shift+P or Ctrl+Shift+P)
  2. Run "Manage Amazon Bedrock Provider"
  3. Choose "Set AWS Profile" to select from your available profiles
  4. Choose "Set Region" to select your preferred AWS region
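
For reference, a named profile is just an entry in the standard AWS configuration files. A minimal setup might look like the following (the profile name `bedrock` and the placeholder key values are only examples):

```ini
# ~/.aws/credentials
[bedrock]
aws_access_key_id = AKIA...
aws_secret_access_key = ...

# ~/.aws/config
[profile bedrock]
region = us-east-1
```

Profiles defined this way show up in the "Set AWS Profile" picker; choosing none falls back to the default credentials chain.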

Available Regions

  • US East (N. Virginia) - us-east-1
  • US East (Ohio) - us-east-2
  • US West (Oregon) - us-west-2
  • Asia Pacific (Mumbai) - ap-south-1
  • Asia Pacific (Tokyo) - ap-northeast-1
  • Asia Pacific (Seoul) - ap-northeast-2
  • Asia Pacific (Singapore) - ap-southeast-1
  • Asia Pacific (Sydney) - ap-southeast-2
  • Canada (Central) - ca-central-1
  • Europe (Frankfurt) - eu-central-1
  • Europe (Ireland) - eu-west-1
  • Europe (London) - eu-west-2
  • Europe (Paris) - eu-west-3
  • South America (São Paulo) - sa-east-1

Usage

Once configured, Bedrock models will appear in GitHub Copilot Chat's model selector. Simply:

  1. Open GitHub Copilot Chat
  2. Click on the model selector
  3. Choose a Bedrock model (they will be labeled with "Amazon Bedrock")
  4. Start chatting!

Supported Models

The extension automatically filters and displays only models that support tool calling (function calling), which is essential for GitHub Copilot Chat features like @workspace, @terminal, and other integrations.

Supported Model Families

Anthropic Claude:

  • Claude Sonnet 4.5 and Claude Sonnet 4
  • Claude Opus 4.1 and Claude Opus 4
  • Claude 3.7 Sonnet
  • Claude 3.5 Sonnet and Claude 3.5 Haiku (legacy)
  • Claude 3 family: Opus, Sonnet, Haiku (legacy)

OpenAI OSS:

  • gpt-oss-120b (120B parameters, near o4-mini performance)
  • gpt-oss-20b (20B parameters, optimized for edge deployment)

Amazon Nova:

  • Nova Premier, Nova Pro, Nova Lite, Nova Micro

Meta Llama:

  • Llama 3.1 and later (8B, 70B, 405B variants)
  • Llama 3.2 (11B, 90B)
  • Llama 4 (Scout, Maverick)

Mistral AI:

  • Mistral Large and Mistral Large 2
  • Mistral Small
  • Pixtral Large

Cohere:

  • Command R and Command R+

AI21 Labs:

  • Jamba 1.5 Large and Jamba 1.5 Mini

Writer:

  • Palmyra X4 and Palmyra X5

DeepSeek:

  • DeepSeek models (via Amazon Bedrock when available)

Models Automatically Excluded

The following models are filtered out as they don't support the Converse API tool use feature:

  • Amazon Titan Text (legacy models)
  • Stability AI models (image generation only)
  • AI21 Jurassic 2
  • Meta Llama 2 and Llama 3.0
  • All embedding models (Titan Embed, Cohere Embed)
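
The exclusion logic can be pictured as a simple prefix-based allow/deny filter over Bedrock model IDs. This is only a sketch of the idea, not the extension's actual implementation; the excluded prefixes below mirror the families named above, and the catalog entries are illustrative model IDs.

```python
# Sketch: filter a Bedrock model catalog down to tool-calling-capable models.
# Prefixes of model-ID families that lack Converse API tool use (per the
# lists above); the extension's real list may differ.
EXCLUDED_PREFIXES = (
    "amazon.titan-text",   # legacy Titan Text
    "amazon.titan-embed",  # embedding models
    "stability.",          # image generation only
    "ai21.j2",             # Jurassic 2
    "cohere.embed",        # embedding models
)

def supports_tool_use(model_id: str) -> bool:
    """Return False for model families that lack Converse API tool use."""
    return not model_id.startswith(EXCLUDED_PREFIXES)

catalog = [
    "anthropic.claude-3-5-sonnet-20241022-v2:0",
    "amazon.titan-text-express-v1",
    "stability.stable-diffusion-xl-v1",
    "cohere.command-r-plus-v1:0",
]
print([m for m in catalog if supports_tool_use(m)])
# Only the Claude and Command R+ IDs survive the filter.
```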

Troubleshooting

Models not showing up

  1. Verify your AWS credentials are correctly configured
  2. Check that you've selected the correct AWS profile and region
  3. Ensure your AWS account has access to Bedrock in the selected region
  4. Check the "Bedrock Chat" output channel for error messages

Authentication errors

  1. Verify your AWS credentials are valid and not expired
  2. Check that your IAM user/role has the necessary Bedrock permissions:
    • bedrock:ListFoundationModels
    • bedrock:InvokeModel
    • bedrock:InvokeModelWithResponseStream
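
The permissions above map onto a minimal IAM policy along these lines (a sketch only; consider scoping `Resource` down to specific model ARNs in production):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:ListFoundationModels",
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```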

Credits

This extension is based on huggingface-vscode-chat and vscode-copilot-chat PR#1046.

AWS profile handling inspired by AWS Toolkit for Visual Studio Code.

License

MIT
