Ollama Proxy Manager

By TuanTran

Seamlessly manage your Ollama Proxy server directly from VS Code. This extension provides a powerful JSON editor with validation and automatic server management for routing LLM requests to multiple providers (OpenAI, Anthropic, etc.).

Features

  • Auto-start Server: Automatically starts the Ollama Proxy server when VS Code opens
  • Configuration Editor: JSON editor with validation for models.config.json
  • Server Management: Start, stop, and restart the proxy server from VS Code
  • Status Bar: Real-time server status indicator
  • Health Monitoring: Automatic health checks to ensure the server is running
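
The health-monitoring feature amounts to periodically probing the proxy's HTTP endpoint. A minimal sketch, assuming a local endpoint URL (the default URL below is an assumption, not the extension's documented endpoint, and the real check may differ):

```typescript
// Hypothetical health probe: resolves true if the proxy answers with a 2xx.
// The default URL is an assumption for illustration only.
async function checkHealth(url = "http://localhost:11434/"): Promise<boolean> {
  try {
    const res = await fetch(url); // fetch is built in on Node 18+
    return res.ok;
  } catch {
    return false; // connection refused, DNS failure, timeout, etc.
  }
}
```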

Configuration

The extension looks for configuration files in the following order:

  1. ~/.vscode/ollama-proxy/models.config.json (global user config)
  2. <workspace>/.vscode/models.config.json (workspace config)
  3. <workspace>/src/models.config.json (current location)
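
The lookup order above is a "first existing path wins" search. The helper below is an illustrative sketch only, not the extension's actual source:

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Return the first candidate config path that exists on disk,
// in the priority order documented above.
function resolveConfigPath(workspaceRoot: string): string | undefined {
  const candidates = [
    // 1. global user config
    path.join(os.homedir(), ".vscode", "ollama-proxy", "models.config.json"),
    // 2. workspace config
    path.join(workspaceRoot, ".vscode", "models.config.json"),
    // 3. project source tree
    path.join(workspaceRoot, "src", "models.config.json"),
  ];
  return candidates.find((p) => fs.existsSync(p));
}
```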

Commands

  • Ollama Proxy: Open Configuration - Open the JSON configuration editor
  • Ollama Proxy: Start Server - Start the proxy server
  • Ollama Proxy: Stop Server - Stop the proxy server
  • Ollama Proxy: Restart Server - Restart the proxy server
  • Ollama Proxy: Show Output - Show server output logs
  • Ollama Proxy: Check Server Status - Check current server status and health

Settings

  • ollamaProxy.autoStart (default: true) - Automatically start the server when VS Code opens
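
For example, to keep the server from launching automatically, set the option in your user or workspace settings.json:

```json
{
  "ollamaProxy.autoStart": false
}
```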

Installation

From VS Code Marketplace (Recommended)

  1. Open VS Code
  2. Go to Extensions (Cmd+Shift+X / Ctrl+Shift+X)
  3. Search for "Ollama Proxy Manager"
  4. Click Install

From VSIX File

code --install-extension ollama-proxy-manager-0.1.0.vsix

Requirements

  • Bun 1.2.0 or higher
  • Ollama Proxy project must be open in the workspace
  • Node.js 18+ (for extension runtime)

Usage

  1. Open the Ollama Proxy project in VS Code
  2. The extension will automatically start the server (if autoStart is enabled)
  3. Use Command Palette (Cmd+Shift+P / Ctrl+Shift+P) to access commands
  4. Click the status bar item to open the configuration editor
  5. Edit configuration using the Monaco editor with JSON schema validation
  6. Save changes and restart the server to apply them

Configuration Schema

The configuration file follows this schema:

{
  "providers": {
    "provider-name": {
      "endpoint": "https://api.example.com/v1",
      "apiKey": "your-api-key",
      "apiType": "messages"
    }
  },
  "models": {
    "model-name": {
      "provider": "provider-name",
      "model": "gpt-3.5-turbo",
      "apiType": "messages"
    }
  }
}
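
A filled-in configuration might look like the following. The provider name, model alias, and endpoint are illustrative only; replace "your-api-key" with a real key:

```json
{
  "providers": {
    "openai": {
      "endpoint": "https://api.openai.com/v1",
      "apiKey": "your-api-key",
      "apiType": "messages"
    }
  },
  "models": {
    "fast-chat": {
      "provider": "openai",
      "model": "gpt-3.5-turbo",
      "apiType": "messages"
    }
  }
}
```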

Security Note

API keys are stored in plain text in the configuration file. Ensure proper file permissions:

chmod 600 ~/.vscode/ollama-proxy/models.config.json

Development

To run the extension in development mode:

  1. Open the extension folder in VS Code
  2. Press F5 to launch Extension Development Host
  3. Test features in the new VS Code window

License

MIT
