A VS Code extension that provides a graphical interface for managing and interacting with your local Ollama models (such as DeepSeek) directly within VS Code.
## Usage

1. Open the Command Palette (`Ctrl+Shift+P` or `Cmd+Shift+P`)
2. Search for and select **chat with ollama hub**
3. Type your message in the text area
4. Select your preferred model from the dropdown
5. Click **Ask** or press Enter to send
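Under the hood, clicking **Ask** presumably posts the message to the local Ollama server's `/api/chat` endpoint. Below is a minimal sketch of building that request payload; the field names (`model`, `messages`, `stream`) follow Ollama's public REST API, but `buildChatRequest` itself is a hypothetical helper, not code from this extension:

```typescript
// Sketch: construct the JSON body for POST http://localhost:11434/api/chat.
// Field names follow Ollama's documented REST API; this helper is
// illustrative only and is not taken from the extension's source.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(model: string, history: ChatMessage[]): string {
  return JSON.stringify({
    model,             // e.g. "deepseek-r1" (assumed model name)
    messages: history, // sending the full history preserves conversation context
    stream: true,      // request incremental newline-delimited JSON chunks
  });
}
```

The extension could then send this body with `fetch` to the local server and render the reply in the chat view as it arrives.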
## Features

- 💬 **Real-time Chat Interface** - Streamlined chat UI within VS Code
- 🚀 **Streaming Responses** - Receive AI responses in real time
- 📚 **Session History** - Maintains conversation context for the duration of a VS Code session
- 🤖 **Multi-Model Support** - Switch between different Ollama models
- ⚡ **Lightning Fast** - Optimized for quick interactions
- 🔒 **Local Processing** - Works with locally hosted Ollama instances
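The streaming feature above maps onto how Ollama delivers replies: with streaming enabled, `/api/chat` returns newline-delimited JSON, one chunk per line, each carrying a fragment of the assistant's message. A sketch of the accumulation step, assuming the chunk shape documented in Ollama's API (the helper itself is illustrative, not the extension's actual code):

```typescript
// Each NDJSON line from Ollama's /api/chat looks roughly like:
//   {"message":{"role":"assistant","content":"Hel"},"done":false}
// with done:true on the final chunk. Illustrative helper, not extension source.
interface StreamChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Join the content fragments of a buffered NDJSON payload into the full reply.
function accumulateReply(ndjson: string): string {
  let reply = "";
  for (const line of ndjson.split("\n")) {
    if (!line.trim()) continue; // skip blank lines between chunks
    const chunk: StreamChunk = JSON.parse(line);
    if (chunk.message) reply += chunk.message.content;
  }
  return reply;
}
```

In the extension, the same append step would run once per chunk as it arrives, which is what makes the response appear in the chat view word by word.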