Universal LLM proxy for VS Code, supporting GitHub Copilot and Azure Papyrus backends.
Enables Claude Code, Codex CLI, and other AI tools to use enterprise LLM services through a local HTTP proxy that transparently handles authentication and translates between the Anthropic and OpenAI API formats.
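The core of the format translation is mapping an Anthropic Messages request onto an OpenAI chat-completions request. The sketch below illustrates the idea using the field names of the two public APIs; the function name and exact structure are illustrative, not this project's actual code.

```python
def anthropic_to_openai(body: dict) -> dict:
    """Translate an Anthropic Messages request into OpenAI chat format (sketch)."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first message in the list.
    if "system" in body:
        messages.append({"role": "system", "content": body["system"]})
    for msg in body.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten the text blocks.
        if isinstance(content, list):
            content = "".join(
                block["text"] for block in content if block.get("type") == "text"
            )
        messages.append({"role": msg["role"], "content": content})
    return {
        "model": body["model"],
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
        "stream": body.get("stream", False),
    }
```

A real translation layer also has to map tool definitions, tool-result blocks, and stop reasons in both directions; this sketch covers only the message shape.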
Features
GitHub Copilot backend — Direct API calls with OAuth device flow authentication
Azure Papyrus backend — Enterprise LLM access via Azure Identity
Format translation — Accepts both Anthropic and OpenAI request formats
Streaming support — Full SSE streaming with tool call argument accumulation
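The tool-call accumulation mentioned above is needed because OpenAI-style SSE streams deliver a tool call's JSON arguments as string fragments spread across many chunks, keyed by a tool-call index; the proxy must concatenate the fragments before the arguments parse as JSON. A minimal sketch of that accumulation, with illustrative names and chunk shapes (not this project's actual code):

```python
import json

def accumulate_tool_calls(chunks: list[dict]) -> list[dict]:
    """Merge streamed tool_call deltas into complete tool calls (sketch)."""
    calls: dict[int, dict] = {}
    for chunk in chunks:
        for delta in chunk.get("tool_calls", []):
            idx = delta["index"]
            call = calls.setdefault(idx, {"id": None, "name": None, "arguments": ""})
            if delta.get("id"):
                call["id"] = delta["id"]
            fn = delta.get("function", {})
            if fn.get("name"):
                call["name"] = fn["name"]
            # Argument JSON arrives in fragments; concatenate per index.
            call["arguments"] += fn.get("arguments", "")
    # Only after the stream ends is the accumulated string valid JSON.
    return [
        {**call, "arguments": json.loads(call["arguments"])}
        for _, call in sorted(calls.items())
    ]
```

Parsing is deferred to the end of the stream because any individual fragment (e.g. `{"ci`) is not valid JSON on its own.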