# MCP Ollama Manager Extension

A VS Code extension for managing the MCP Ollama Python server, providing a convenient interface to start, stop, configure, and monitor your Ollama MCP server directly from Visual Studio Code.
## Features
- 🚀 Server Management: Start, stop, and restart the MCP Ollama server with simple commands
- 📊 Status Monitoring: Real-time server status in the status bar with health checks
- ⚙️ Configuration Management: Easy configuration of server settings through VS Code settings
- 📝 Log Viewing: Built-in output channel for server logs and monitoring
- 🤖 Model Management: List and view details of available Ollama models
- 🔧 Auto-start Option: Configure the server to start automatically with VS Code
## Requirements

- A running Ollama instance reachable at `http://{serverHost}:11434`
- The MCP Ollama Python server, installed separately (see the MCP Ollama Python documentation)
- A Python interpreter (configurable via the `mcp-ollama.pythonPath` setting)

## Installation
### From VS Code Marketplace
- Open VS Code
- Go to Extensions (Ctrl+Shift+X)
- Search for "MCP Ollama Manager"
- Click Install
### From Source

- Clone this repository
- Install dependencies:

  ```bash
  npm install
  ```

- Compile:

  ```bash
  npm run compile
  ```

- Package:

  ```bash
  npm run package
  ```

- Install the resulting `.vsix` file:

  ```bash
  code --install-extension mcp-ollama-extension-*.vsix
  ```

Or, to upgrade an existing installation:

- Uninstall the old version (optional but recommended):

  ```bash
  code --uninstall-extension internetics.mcp-ollama-extension
  ```

- Install the new VSIX:

  ```bash
  code --install-extension mcp-ollama-extension-1.0.1.vsix
  ```
## Configuration
The extension can be configured through VS Code settings. Open settings (Ctrl+,) and search for "MCP Ollama".
If your Ollama server is running on a different host (not `localhost`), you must configure it:

- Open VS Code Settings (Ctrl+,)
- Search for "MCP Ollama"
- Set "Mcp-ollama: Server Host" to your Ollama hostname (e.g., `ai`, `192.168.1.100`, etc.)

The extension will connect to Ollama at `http://{serverHost}:11434`.

Example configurations:

- Local Ollama: `localhost` (default)
- Network hostname: `ai`
- IP address: `192.168.1.100`
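For example, pointing the extension at a machine named `ai` could look like this in `settings.json` (the hostname is illustrative):

```json
{
  "mcp-ollama.serverHost": "ai"
}
```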
### Settings

| Setting | Type | Default | Description |
|---------|------|---------|-------------|
| `mcp-ollama.serverPath` | string | - | Path to the mcp-ollama-python installation directory |
| `mcp-ollama.pythonPath` | string | `python` | Path to the Python executable |
| `mcp-ollama.serverHost` | string | `localhost` | Host for the MCP server |
| `mcp-ollama.serverPort` | number | `8000` | Port for the MCP server |
| `mcp-ollama.autoStart` | boolean | `false` | Automatically start the server when VS Code starts |
| `mcp-ollama.logLevel` | string | `info` | Log level for the server (`debug`, `info`, `warning`, `error`) |
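Taken together, a `settings.json` fragment covering these options might look like the following (the path and values are illustrative, not recommendations):

```json
{
  "mcp-ollama.serverPath": "/path/to/mcp-ollama-python",
  "mcp-ollama.pythonPath": "python",
  "mcp-ollama.serverHost": "localhost",
  "mcp-ollama.serverPort": 8000,
  "mcp-ollama.autoStart": false,
  "mcp-ollama.logLevel": "info"
}
```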
## Commands

The extension provides the following commands (available in the Command Palette, Ctrl+Shift+P):

- `MCP Ollama: Start Server` - Start the MCP Ollama server
- `MCP Ollama: Stop Server` - Stop the MCP Ollama server
- `MCP Ollama: Restart Server` - Restart the MCP Ollama server
- `MCP Ollama: Show Server Status` - Display the current server status and configuration
- `MCP Ollama: Configure Server` - Open configuration options
- `MCP Ollama: View Server Logs` - Show the server output channel
- `MCP Ollama: List Available Models` - List and manage Ollama models
## Usage

### First Time Setup
- Install the extension
- Open the Command Palette (Ctrl+Shift+P)
- Run "MCP Ollama: Configure Server"
- Select "Configure Server Path" and choose your mcp-ollama-python installation directory
- Configure other settings as needed (port, log level, etc.)
### Starting the Server
- Use the command "MCP Ollama: Start Server" or
- Click the status bar item "MCP Ollama" or
- Enable auto-start in settings
### Monitoring
- The status bar shows the server status (🟢 running, 🔴 stopped)
- View real-time logs with "MCP Ollama: View Server Logs"
- Check server status with "MCP Ollama: Show Server Status"
### Managing Models
- Use "MCP Ollama: List Available Models" to see all installed models
- Select a model to view detailed information
- Models are shown in the Explorer view when the server is running
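Under the hood, a model list like this is typically read from Ollama's REST API (`GET /api/tags`), whose response contains a `models` array. The sketch below shows how such a response might be reduced to the names a "List Available Models" picker would display; the sample data is illustrative, not real output.

```javascript
// Illustrative sample of the shape returned by Ollama's GET /api/tags.
const sampleTagsResponse = {
  models: [
    { name: "llama3:latest", size: 4661224676, modified_at: "2024-01-01T00:00:00Z" },
    { name: "mistral:latest", size: 4113301824, modified_at: "2024-01-02T00:00:00Z" },
  ],
};

// Extract just the model names, as a quick-pick list might show them.
function listModelNames(response) {
  return (response.models || []).map((m) => m.name);
}

console.log(listModelNames(sampleTagsResponse));
```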
## Development

### Prerequisites

```bash
# Install dependencies
npm install
```
### Building for Development
Development builds are optimized for fast compilation and debugging:
```bash
# Compile TypeScript (development mode)
npm run compile

# Watch mode - automatically recompile on file changes
npm run watch
```
Development build features:

- Fast compilation with `transpileOnly` mode (3-5x faster)
- Detailed source maps (`eval-source-map`) for better debugging
- No minification, for readable output
- Verbose webpack logging
- Filesystem caching for faster rebuilds (50-80% faster)
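The features above map onto a handful of webpack options. This is a hypothetical sketch of a development config consistent with that list; the actual `webpack.config.js` in this repository may differ.

```javascript
// Sketch of a webpack development config (webpack 5 option names).
const devConfig = {
  mode: "development",
  target: "node",                           // VS Code extensions run in a Node host
  devtool: "eval-source-map",               // detailed inline source maps
  externals: { vscode: "commonjs vscode" }, // the vscode module is provided at runtime
  module: {
    rules: [
      {
        test: /\.ts$/,
        loader: "ts-loader",
        options: { transpileOnly: true },   // skip type checking for faster builds
      },
    ],
  },
  cache: { type: "filesystem" },            // persist the cache between rebuilds
};

module.exports = devConfig;
```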
### Building for Production
Production builds are optimized for size and performance:
```bash
# Build for production (cross-platform)
npm run build:prod

# Package as VSIX for distribution
npm run package
```
Note: The build scripts use `cross-env` for cross-platform compatibility (works in Windows PowerShell, CMD, Linux, and macOS).
Production build features:
- Full TypeScript type checking
- Minification (40-60% smaller bundles)
- Optimized source maps (separate files)
- Deterministic module IDs for better caching
- Tree-shaking to remove unused code
- Single-bundle output (required for VS Code extensions)
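As with the development build, the production features correspond to standard webpack options. The sketch below is hypothetical and may differ from the repository's real config:

```javascript
// Sketch of a webpack production config matching the listed features.
const prodConfig = {
  mode: "production",            // enables minification and tree-shaking
  target: "node",
  devtool: "source-map",         // emit source maps as separate files
  externals: { vscode: "commonjs vscode" },
  optimization: {
    minimize: true,
    moduleIds: "deterministic",  // stable module IDs for better caching
  },
  output: {
    libraryTarget: "commonjs2",  // single CommonJS bundle, as VS Code expects
  },
};

module.exports = prodConfig;
```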
### Build Comparison

| Feature | Development | Production |
|---------|-------------|------------|
| Compilation speed | Fast (`transpileOnly`) | Slower (full type check) |
| Bundle size | Larger | 40-60% smaller |
| Source maps | Inline (`eval-source-map`) | Separate files |
| Minification | No | Yes |
| Debugging | Excellent | Good |
| Rebuild time | 50-80% faster (cached) | Standard |
### Quick Commands

```bash
# Development workflow
npm install          # Install dependencies
npm run watch        # Start watch mode for development

# Production workflow
npm install          # Install dependencies
npm run build:prod   # Build for production
npm run package      # Create VSIX package

# Install locally
code --install-extension mcp-ollama-extension-*.vsix
```
### Testing

```bash
# Run tests
npm test

# Run tests in watch mode
npm run watch-tests
```
### Debugging
- Open the project in VS Code
- Press F5 to launch a new VS Code instance with the extension
- Use the debugger to set breakpoints and debug the extension
- Check the Debug Console for logs and errors
- Use "MCP Ollama: View Server Logs" to see server output
Debug Tips:
- Development builds include detailed source maps for accurate debugging
- Use `logger.debug()` for verbose logging (set the log level to `debug`)
- The extension logs are stored in VS Code's log directory
- Server logs are available in the output channel
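The README does not show the extension's actual logger API, so the following is purely an illustrative sketch of how level-gated logging like `logger.debug()` typically behaves: messages below the configured level are dropped.

```javascript
// Hypothetical level-gated logger; names are illustrative only.
const LEVELS = { debug: 0, info: 1, warning: 2, error: 3 };

function makeLogger(minLevel, sink) {
  const log = (level, msg) => {
    // Emit only messages at or above the configured minimum level.
    if (LEVELS[level] >= LEVELS[minLevel]) sink(`[${level}] ${msg}`);
  };
  return {
    debug: (msg) => log("debug", msg),
    info: (msg) => log("info", msg),
    warning: (msg) => log("warning", msg),
    error: (msg) => log("error", msg),
  };
}

// With the level set to "info", debug messages are suppressed.
const lines = [];
const logger = makeLogger("info", (line) => lines.push(line));
logger.debug("not recorded");
logger.info("recorded");
```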
## Changelog

### 1.0.0
- Initial release
- Basic server management (start, stop, restart)
- Status monitoring with health checks
- Configuration management
- Log viewing
- Model listing and management
- Auto-start capability
## Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
## License
MIT License - see LICENSE file for details.
## Support
- Report issues on GitHub Issues
- Check the Wiki for documentation
- Join our discussions for questions and suggestions
## Related Projects

- MCP Ollama Python - The Python MCP server for Ollama
- Ollama - Get up and running with large language models locally
Note: This extension requires the MCP Ollama Python server to be installed separately. Please refer to the MCP Ollama Python documentation for installation instructions.