OrcA is a powerful VS Code extension that orchestrates multiple LLM agents (Gemini, Claude, OpenAI, DeepSeek, and Local models via Ollama) to work together seamlessly on complex coding tasks.
## Features

- **Multi-Agent Orchestration**: A central "Manager" LLM coordinates sub-tasks across different specialist models.
- **Responsibility Splitting**: Automatically generates an `orca_plan.md` to define and track agent assignments.
- **Conflict Avoidance**: Logic to prevent multiple agents from making clashing edits to the same code regions.
- **Clean Group Chat**: A unified chat interface where agents collaborate without each agent's internal reasoning cluttering the shared conversation.
- **Deep Integration**: Supports Gemini, Anthropic (Claude), OpenAI (GPT-4o), DeepSeek, and local models through Ollama.
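The `orca_plan.md` mentioned above is generated by the extension. Its exact layout isn't documented here, but a plan of this general shape (the headings, agents, file names, and task names below are illustrative only, not the real generated format) conveys the idea of assigning each sub-task and its files to one agent:

```markdown
# OrcA Plan (illustrative example only)

## Task: Add input validation to the signup form

| Sub-task                   | Agent  | Files (claimed)       | Status |
| -------------------------- | ------ | --------------------- | ------ |
| Write validation helpers   | Claude | src/validate.ts       | done   |
| Wire helpers into the form | GPT-4o | src/SignupForm.tsx    | active |
| Add unit tests             | Gemini | test/validate.test.ts | queued |
```

Claiming files per sub-task is one simple way to realize the conflict-avoidance feature: two agents never hold an active claim on the same file at once.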
## Getting Started

1. **Install the Extension**: Install OrcA from the VS Code Marketplace.
2. **Configure API Keys**:
   - Open VS Code Settings (`Ctrl+,`).
   - Search for "OrcA".
   - Enter your API keys for Gemini, Claude, OpenAI, and/or DeepSeek.
3. **Start Orchestrating**:
   - Press `Ctrl+Shift+P` and run `OrcA: Start Orchestration`.
   - Enter your task and watch OrcA coordinate the agents.
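The API-key step above can also be done directly in `settings.json`. The setting keys below are assumptions for illustration; check the extension's own settings page for the actual names:

```jsonc
{
  // Hypothetical setting keys -- verify against the extension's
  // contributed settings in the VS Code Settings UI.
  "orca.gemini.apiKey": "YOUR_GEMINI_KEY",
  "orca.anthropic.apiKey": "YOUR_CLAUDE_KEY",
  "orca.openai.apiKey": "YOUR_OPENAI_KEY",
  "orca.deepseek.apiKey": "YOUR_DEEPSEEK_KEY",
  // Optional: where to reach a locally running Ollama instance
  // (11434 is Ollama's default port).
  "orca.ollama.endpoint": "http://localhost:11434"
}
```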
## Architecture

OrcA follows a provider-based architecture, making it easy to add or swap LLM backends. The core Orchestrator ensures that even with multiple agents active, the workspace remains consistent and the development flow stays smooth.
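One way to picture the provider-based design is a common interface plus a registry, so the Orchestrator never depends on a concrete backend. This is a sketch, not OrcA's actual source; the interface and class names are assumptions:

```typescript
// Minimal provider-registry sketch. Each backend (Gemini, Claude,
// OpenAI, DeepSeek, Ollama) would implement this one interface.
interface LLMProvider {
  readonly name: string;
  complete(prompt: string): Promise<string>;
}

// Stand-in provider that just echoes, so the sketch is runnable
// without any API keys.
class EchoProvider implements LLMProvider {
  constructor(readonly name: string) {}
  async complete(prompt: string): Promise<string> {
    return `[${this.name}] ${prompt}`;
  }
}

// The orchestrator talks only to the registry, so adding or
// swapping a backend is a one-line registration.
class ProviderRegistry {
  private providers = new Map<string, LLMProvider>();
  register(p: LLMProvider): void {
    this.providers.set(p.name, p);
  }
  get(name: string): LLMProvider {
    const p = this.providers.get(name);
    if (!p) throw new Error(`Unknown provider: ${name}`);
    return p;
  }
}

// Usage: register two backends and dispatch a sub-task to one.
const registry = new ProviderRegistry();
registry.register(new EchoProvider("ollama"));
registry.register(new EchoProvider("gemini"));
registry.get("gemini").complete("Refactor utils.ts").then(console.log);
```

Because every backend hides behind `LLMProvider`, a local Ollama model and a hosted API are interchangeable from the Orchestrator's point of view.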
## Requirements

- VS Code v1.80.0 or higher.
- API keys for the respective LLM providers you wish to use.
- (Optional) Ollama running locally for local model support.
## License

This project is licensed under the MIT License; see the LICENSE file for details.