Agent Maestro
Turn VS Code into your compliant AI playground! With Agent Maestro, spin up Cline or Roo on demand and plug Claude Code or Codex straight in through an OpenAI/Anthropic-compatible API.


Key Features
Turn VS Code into your compliant AI playground with powerful API compatibility and one-click setup:
- Universal API Compatibility: Anthropic (/messages) and OpenAI (/chat/completions) compatible endpoints - use Claude Code, Codex, or any LLM client seamlessly
- One-Click Setup: Automated configuration commands for instant Claude Code and Codex integration
- Headless AI Agent Control: Create and manage tasks through REST APIs for Roo Code and Cline extensions
- Comprehensive APIs: Complete task lifecycle management with OpenAPI documentation at /openapi.json
- Parallel Execution: Run up to 20 concurrent RooCode (and its variants like Kilo Code) tasks with built-in MCP server integration
- Real-time Streaming: Server-Sent Events (SSE) for live task monitoring and message updates
- Flexible Configuration: Workspace-level settings, environment variables, and extension auto-discovery
Quick Start
Prerequisites
Agent Maestro assumes you have already installed one of the supported AI coding extensions:
- Roo Code or its variants for comprehensive API control
- Claude Code or Codex for personal development routines
Installation
Install the Agent Maestro extension from the VS Code Marketplace. Once activated, Agent Maestro starts its API server automatically.
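To confirm the server is running, you can fetch the OpenAPI specification from the default port (adjust the port if you have overridden it, as described under Configuration):
# Returns the OpenAPI v3 document when the proxy server is up on the default port.
curl http://localhost:23333/openapi.json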
One-Click Setup for Claude Code
Configure Claude Code to use VS Code's language models with a single command: run Agent Maestro: Configure Claude Code Settings from the Command Palette.
This automatically creates or updates .claude/settings.json with the Agent Maestro endpoint and fills in the available LLM models from VS Code.
That's it! You can now use Claude Code with VS Code's built-in language models.
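The setup command is the supported path; as an alternative sketch, Claude Code can usually be pointed at a compatible gateway through the ANTHROPIC_BASE_URL environment variable (an assumption about Claude Code's configuration, not something Agent Maestro requires), assuming the default proxy port:
# Assumption: Claude Code honors ANTHROPIC_BASE_URL; Agent Maestro's default port is 23333.
export ANTHROPIC_BASE_URL=http://localhost:23333/api/anthropic
claude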
One-Click Setup for Codex
Configure Codex to use VS Code's language models with a single command: run Agent Maestro: Configure Codex Settings from the Command Palette.
This automatically creates or updates ~/.codex/config.toml with the Agent Maestro endpoint and sets GPT-5-Codex as the recommended model.
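To see which VS Code chat models are available for the generated config to reference, you can query the model listing route (default port assumed):
# Lists the VS Code Language Model chat models exposed through Agent Maestro.
curl http://localhost:23333/api/v1/lm/chatModels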
Usage
Explore API Capabilities: Access the complete OpenAPI specification at http://localhost:23333/openapi.json.
VS Code Commands: Access functionality through the Command Palette:
Server Management:
- Agent Maestro: Start API Server - Start the proxy API server
- Agent Maestro: Stop API Server - Stop the proxy API server
- Agent Maestro: Restart API Server - Restart the proxy API server
- Agent Maestro: Get API Server Status - Check current server status
MCP Server Management:
- Agent Maestro: Start MCP Server - Start the Model Context Protocol server
- Agent Maestro: Stop MCP Server - Stop the MCP server
- Agent Maestro: Get MCP Server Status - Check current MCP server status
- Agent Maestro: Install MCP Configuration - Install MCP configuration for supported extensions
Extension Management:
- Agent Maestro: Get Extensions Status - Check the status of supported AI extensions
Configuration Commands:
- Agent Maestro: Configure Claude Code Settings - One-click Claude Code setup
- Agent Maestro: Configure Codex Settings - One-click Codex setup
Development Resources:
- API Documentation: Complete reference in docs/roo-code/
- Type Definitions: @roo-code/types package
- Examples: Reference implementation in examples/demo-site (for testing purposes)
Configuration
Environment Variables
You can customize Agent Maestro's server ports using environment variables:
Variable | Description | Default
AGENT_MAESTRO_PROXY_PORT | Proxy server port | 23333
AGENT_MAESTRO_MCP_PORT | MCP server port | 23334
Usage:
# Set custom ports
export AGENT_MAESTRO_PROXY_PORT=8080
export AGENT_MAESTRO_MCP_PORT=8081
# Launch VS Code
code .
Note: Environment variables take precedence over extension settings.
Workspace-Level Configuration
You can configure Agent Maestro settings per workspace by adding them to your project's .vscode/settings.json file:
{
"agent-maestro.defaultRooIdentifier": "roo-cline",
"agent-maestro.proxyServerPort": 23333,
"agent-maestro.mcpServerPort": 23334
}
Available Settings:
Setting | Description | Default
agent-maestro.defaultRooIdentifier | Default Roo extension to use | "roo-cline"
agent-maestro.proxyServerPort | Proxy server port | 23333
agent-maestro.mcpServerPort | MCP server port | 23334
This allows different projects to use different configurations without affecting your global VS Code settings.
API Overview
💡 Always refer to /openapi.json for the latest API documentation.
Base URLs
- REST API: http://localhost:23333/api/v1
- Anthropic API: http://localhost:23333/api/anthropic
- OpenAI API: http://localhost:23333/api/openai
- MCP Server: http://localhost:23334
Anthropic-Compatible Endpoints
Perfect for GitHub Copilot and Claude Code integration:
- POST /api/anthropic/v1/messages - Anthropic Claude API compatibility using VS Code's Language Model API
- POST /api/anthropic/v1/messages/count_tokens - Token counting for Anthropic-compatible messages
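As an illustration, a minimal request against the messages endpoint could look like the sketch below; the model ID is a placeholder, so take real identifiers from /api/v1/lm/chatModels and add any auth headers your setup needs:
# Anthropic-style Messages request; "claude-sonnet-4" is a placeholder model ID.
# The local proxy may not need API-key or version headers; add them if yours does.
curl http://localhost:23333/api/anthropic/v1/messages \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello from Agent Maestro"}]
  }'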
OpenAI-Compatible Endpoints
Perfect for Codex and OpenAI model integration:
- POST /api/openai/chat/completions - OpenAI Chat Completions API compatibility using VS Code's Language Model API
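A matching sketch for the Chat Completions route (again with a placeholder model ID):
# Standard Chat Completions payload; replace the model with one listed by /api/v1/lm/chatModels.
curl http://localhost:23333/api/openai/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from Agent Maestro"}]
  }'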
RooCode Agent Routes
Full-featured agent integration with real-time streaming:
- POST /api/v1/roo/task - Create new RooCode task with SSE streaming
- POST /api/v1/roo/task/{taskId}/message - Send message to existing task with SSE streaming
- POST /api/v1/roo/task/{taskId}/action - Perform actions (pressPrimaryButton, pressSecondaryButton, cancel, resume)
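A rough sketch of creating a task and following its SSE stream is shown below; the text field in the body is hypothetical, so treat it as illustrative and take the real request schema from /openapi.json:
# -N disables curl's buffering so Server-Sent Events print as they arrive.
# The "text" field is a hypothetical example; consult /openapi.json for the actual schema.
curl -N http://localhost:23333/api/v1/roo/task \
  -H "Content-Type: application/json" \
  -d '{"text": "Summarize the open TODOs in this workspace"}'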
VS Code Language Model API
Direct access to VS Code's language model ecosystem:
- GET /api/v1/lm/tools - Lists all tools registered via lm.registerTool()
- GET /api/v1/lm/chatModels - Lists available VS Code Language Model API chat models
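Both routes are plain GETs; for example, to inspect the registered tools (default port assumed):
# Returns tools registered through VS Code's lm.registerTool() API.
curl http://localhost:23333/api/v1/lm/tools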
Cline Agent Routes
Basic integration support:
- POST /api/v1/cline/task - Create new Cline task (basic support)
Documentation Routes
- GET /openapi.json - Complete OpenAPI v3 specification
Migration from v1.x
⚠️ Important changes when upgrading from v1.x:
Roo Task SSE Events Renamed
- Events now follow the RooCodeEventName enum
- The message event remains unchanged (most commonly used)
- Removed events: stream_closed, task_completed, task_aborted, tool_failed, task_created, error, task_resumed
OpenAPI Path Change
- Old: /api/v1/openapi.json
- New: /openapi.json
Roadmap
Our development roadmap includes several exciting enhancements:
- Production Deployment: Code-server compatibility for containerization and deployment
- Headless AI Agent Control: Complete REST API integration for Claude Code and Codex extensions with task lifecycle management
- Task Scheduler: Cron-like scheduling system for automated AI agent tasks and workflows
Contributions Welcome: We encourage community contributions to help expand Agent Maestro's capabilities and support for additional AI coding agents. We recommend using AI coding agents themselves to accelerate your development workflow when contributing to this project.
License
This project is licensed under the terms specified in the LICENSE file.