Ollama Proxy Manager
Seamlessly manage your Ollama Proxy server directly from VS Code. This extension provides a powerful JSON editor with validation and automatic server management for routing LLM requests to multiple providers (OpenAI, Anthropic, etc.).
Features
- Auto-start Server: Automatically starts the Ollama Proxy server when VS Code opens
- Configuration Editor: JSON editor with validation for models.config.json
- Server Management: Start, stop, and restart the proxy server from VS Code
- Status Bar: Real-time server status indicator
- Health Monitoring: Automatic health checks to ensure the server is running
Configuration
The extension looks for configuration files in the following order:
1. ~/.vscode/ollama-proxy/models.config.json (global user config)
2. <workspace>/.vscode/models.config.json (workspace config)
3. <workspace>/src/models.config.json (current location)
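The lookup order above can be sketched as a small helper. The paths come from this README; the function names themselves are illustrative, not the extension's actual code:

```typescript
import * as os from "os";
import * as path from "path";
import * as fs from "fs";

// Candidate config locations, in the priority order documented above.
function candidateConfigPaths(workspaceRoot: string): string[] {
  return [
    path.join(os.homedir(), ".vscode", "ollama-proxy", "models.config.json"),
    path.join(workspaceRoot, ".vscode", "models.config.json"),
    path.join(workspaceRoot, "src", "models.config.json"),
  ];
}

// First existing candidate wins; undefined if none is present.
function resolveConfig(workspaceRoot: string): string | undefined {
  return candidateConfigPaths(workspaceRoot).find((p) => fs.existsSync(p));
}
```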
Commands
- Ollama Proxy: Open Configuration - Open the JSON configuration editor
- Ollama Proxy: Start Server - Start the proxy server
- Ollama Proxy: Stop Server - Stop the proxy server
- Ollama Proxy: Restart Server - Restart the proxy server
- Ollama Proxy: Show Output - Show server output logs
- Ollama Proxy: Check Server Status - Check current server status and health
Settings
ollamaProxy.autoStart (default: true) - Automatically start the server when VS Code opens
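To disable auto-start, set the option in your VS Code settings.json (only ollamaProxy.autoStart is documented here):

```json
{
  "ollamaProxy.autoStart": false
}
```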
Installation
From VS Code Marketplace (Recommended)
- Open VS Code
- Go to Extensions (Cmd+Shift+X / Ctrl+Shift+X)
- Search for "Ollama Proxy Manager"
- Click Install
From VSIX File
code --install-extension ollama-proxy-manager-0.1.0.vsix
Requirements
- Bun 1.2.0 or higher
- Ollama Proxy project must be open in the workspace
- Node.js 18+ (for extension runtime)
Usage
- Open the Ollama Proxy project in VS Code
- The extension will automatically start the server (if autoStart is enabled)
- Use the Command Palette (Cmd+Shift+P / Ctrl+Shift+P) to access commands
- Click the status bar item to open the configuration editor
- Edit configuration using the Monaco editor with JSON schema validation
- Save changes and restart the server to apply
Configuration Schema
The configuration file follows this schema:
{
  "providers": {
    "provider-name": {
      "endpoint": "https://api.example.com/v1",
      "apiKey": "your-api-key",
      "apiType": "messages"
    }
  },
  "models": {
    "model-name": {
      "provider": "provider-name",
      "model": "gpt-3.5-turbo",
      "apiType": "messages"
    }
  }
}
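A filled-in example following this schema might look like the following. The provider names and API keys are placeholders; only the "messages" apiType value and the gpt-3.5-turbo model name appear elsewhere in this README, so treat the rest as illustrative:

```json
{
  "providers": {
    "openai": {
      "endpoint": "https://api.openai.com/v1",
      "apiKey": "sk-...",
      "apiType": "messages"
    },
    "anthropic": {
      "endpoint": "https://api.anthropic.com/v1",
      "apiKey": "sk-ant-...",
      "apiType": "messages"
    }
  },
  "models": {
    "my-gpt": {
      "provider": "openai",
      "model": "gpt-3.5-turbo",
      "apiType": "messages"
    }
  }
}
```

Model entries reference a provider by the key used in the providers block, so each model name you route through the proxy must point at a defined provider.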
Security Note
API keys are stored in plain text in the configuration file. Ensure proper file permissions:
chmod 600 ~/.vscode/ollama-proxy/models.config.json
Development
To run the extension in development mode:
- Open the extension folder in VS Code
- Press F5 to launch the Extension Development Host
- Test features in the new VS Code window
License
MIT