Open LLM Council (OLC)

Chetan Jain
Professional multi-model AI deliberation for VS Code

Open LLM Council brings the power of ensemble AI to your development workflow. Instead of relying on a single model's perspective, consult multiple AI models simultaneously and receive a synthesized, comprehensive answer.

License: MIT


Features

Multi-Model Consultation

Consult multiple AI models (GPT-4, Claude, Gemini, and more) in a single query. Each model provides its unique perspective on your question.

Intelligent Synthesis

A synthesis model combines all responses into a unified, comprehensive answer that captures the best insights from each perspective.

Flexible Configuration

  • Council Size: Select minimal (3), standard (5), or extended (7+) models per query
  • Synthesis Control: Configure which model synthesizes the final answer

Multiple Interfaces

  • Chat Participant: Use @council directly in GitHub Copilot Chat
  • Dedicated Panel: Full-featured webview UI for comprehensive deliberations
  • Activity Bar: Quick access from the VS Code sidebar

How It Works

Open LLM Council implements a 3-stage deliberation process:

| Stage | Description |
| --- | --- |
| 1. Gather | Your question is sent to multiple AI models simultaneously |
| 2. Review (Optional) | Models review and critique each other's responses |
| 3. Synthesize | A chairman model combines all perspectives into a final answer |
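The three stages above can be sketched as a small async pipeline. This is an illustrative sketch only, not the extension's actual implementation: the `ModelFn` type, the stub prompts, and the `deliberate` function are hypothetical stand-ins for calls to the underlying Copilot language models.

```typescript
// A "model" here is anything that maps a prompt to a response.
type ModelFn = (prompt: string) => Promise<string>;

async function deliberate(
  question: string,
  council: ModelFn[],   // stage 1: models consulted in parallel
  chairman: ModelFn,    // stage 3: model that writes the final answer
  debate = false        // stage 2: optional peer-review pass
): Promise<string> {
  // Stage 1: Gather — send the question to every council member at once.
  let responses = await Promise.all(council.map((m) => m(question)));

  // Stage 2: Review (optional) — each member revises after seeing the others.
  if (debate) {
    responses = await Promise.all(
      council.map((m, i) =>
        m(
          `Question: ${question}\nOther answers:\n` +
            responses.filter((_, j) => j !== i).join("\n---\n") +
            `\nRevise your answer in light of the above.`
        )
      )
    );
  }

  // Stage 3: Synthesize — the chairman combines all perspectives.
  return chairman(
    `Question: ${question}\nCouncil answers:\n${responses.join("\n---\n")}\n` +
      `Write one unified, comprehensive answer.`
  );
}
```

In the real extension, each `ModelFn` would wrap a model obtained through GitHub Copilot, and the `debate` flag corresponds to the `enableDebateMode` setting described under Configuration.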

Usage

Chat Participant

In GitHub Copilot Chat, use the @council participant:

```
@council How should I structure a React application with authentication?
```

Commands

| Command | Description |
| --- | --- |
| `@council <question>` | Standard council deliberation |
| `@council /quick <question>` | Quick mode with fewer models |
| `@council /debate <question>` | Full deliberation with peer review |
| `@council /models` | List available AI models |

Webview Panel

Open the dedicated UI panel:

  1. Press Ctrl+Shift+P (or Cmd+Shift+P on Mac)
  2. Run: Open LLM Council: Open Council Panel

Or click the Open LLM Council icon in the Activity Bar.

Applying Settings

After changing settings, click the 🔄 Refresh button in the sidebar header to apply them immediately.


Configuration

Access settings via File > Preferences > Settings and search for "Open LLM Council".

| Setting | Options | Description |
| --- | --- | --- |
| `councilSize` | minimal, standard, extended | Number of models to consult (2/3/4) |
| `councilMember1`–`councilMember4` | Any available model | Choose specific council members |
| `chairmanModel` | Any available model | Model for final synthesis |
| `enableDebateMode` | true/false | Enable peer review stage |
| `streamResponses` | true/false | Stream responses in real time |
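For reference, the settings above can also be set directly in `settings.json`. The fully qualified keys and model identifiers below are assumptions for illustration (the table lists only short names); confirm the exact prefix and values in the Settings UI.

```jsonc
{
  // Hypothetical keys — verify the exact "openLlmCouncil." prefix in the Settings UI.
  "openLlmCouncil.councilSize": "standard",
  "openLlmCouncil.councilMember1": "GPT-4.1",
  "openLlmCouncil.councilMember2": "GPT-4o",
  "openLlmCouncil.councilMember3": "Grok Code Fast 1",
  "openLlmCouncil.chairmanModel": "GPT-5 mini",
  "openLlmCouncil.enableDebateMode": false,
  "openLlmCouncil.streamResponses": true
}
```

After editing, click the 🔄 Refresh button in the sidebar header to apply the changes.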

Available Models

| Model | Description |
| --- | --- |
| GPT-4.1 | OpenAI's advanced reasoning model |
| GPT-4o | OpenAI's fast, capable model |
| GPT-5 mini | OpenAI's latest-generation compact model |
| Grok Code Fast 1 | xAI's fast code-assistance model |

Requirements

  • VS Code: Version 1.85.0 or higher
  • GitHub Copilot: Active subscription (Free, Pro, Business, or Enterprise)
  • GitHub Copilot Chat: Extension must be installed

Troubleshooting

"GitHub Copilot Not Ready" message

If you see this message, Copilot is not properly connected:

  1. Open GitHub Copilot Chat - Press Ctrl+Shift+I (or Cmd+Shift+I on Mac)
  2. Sign in - Make sure you're signed into your GitHub account
  3. Check subscription - Verify your Copilot subscription is active
  4. Reload - Click the "Retry" button in the extension or reload VS Code

Models not available

If certain models aren't appearing:

  1. Update VS Code - Ensure you have VS Code 1.85.0 or higher
  2. Update Copilot - Update GitHub Copilot Chat extension to the latest version
  3. Use /models command - Type @council /models in Copilot Chat to see available models

Extension not responding

  1. Reload Window - Press Ctrl+Shift+P → "Developer: Reload Window"
  2. Check Output - View → Output → Select "Open LLM Council" to see logs
  3. Disable/Enable - Try disabling and re-enabling the extension

Installation

From VS Code Marketplace

  1. Open VS Code
  2. Go to Extensions (Ctrl+Shift+X)
  3. Search for "Open LLM Council"
  4. Click Install

From VSIX

```bash
code --install-extension open-llm-council-2.0.0.vsix
```

Development

```bash
# Clone the repository
git clone https://github.com/ChetanJain281/llm-council.git
cd llm-council

# Install dependencies
npm install

# Compile
npm run compile

# Watch mode for development
npm run watch

# Run the extension: press F5 in VS Code
```

Building for Production

```bash
# Type check, lint, and bundle
npm run package

# Create VSIX package
npm run package:vsix
```

Contributing

Contributions are welcome! Please feel free to submit issues and pull requests.

  1. Fork the repository
  2. Create your feature branch (`git checkout -b feature/amazing-feature`)
  3. Commit your changes (`git commit -m 'Add amazing feature'`)
  4. Push to the branch (`git push origin feature/amazing-feature`)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.


Acknowledgments

Inspired by Andrej Karpathy's llm-council concept of multi-model deliberation.


Made with care for the developer community

CI/CD (optional): A GitHub Actions workflow is included at `.github/workflows/release.yml` to build on tags and publish when `VSCE_PAT` is set in repository secrets.

Why Use a Council?

A single model's response can be biased toward particular approaches. Consulting multiple different models offers:

  1. True diversity - Different models have different training and biases
  2. Better coverage - Claude might catch what GPT misses
  3. Reduced blind spots - Multiple vendors means multiple perspectives
  4. Collective wisdom - Synthesis combines the best of all

