A privacy-first VS Code extension that connects to a local Ollama instance and to cloud AI providers (OpenAI, Anthropic, and Gemini) for chat and coding help. MCP integration is optional and can be toggled on or off.
## Features (current)

- Sidebar chat view
- Streaming responses from Ollama
- Extension ↔ webview messaging (ping/pong)
- MCP tool integration (Phase 1 complete)
- Code autocomplete (inline ghost text)
- Smart Git commit message generation
## Requirements

- VS Code
- An active AI provider, e.g. Ollama running locally at http://localhost:11434, or an API key for OpenAI, Anthropic, or Gemini
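To confirm the local Ollama requirement is met, the sketch below shows how streaming from Ollama works using only the Python standard library. The `/api/generate` endpoint, the `model`/`prompt`/`stream` request fields, and the `response`/`done` fields in each NDJSON line follow Ollama's public REST API; the function names themselves are illustrative, not part of this extension.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local port


def stream_generate(model: str, prompt: str, base_url: str = OLLAMA_URL):
    """Yield text fragments from Ollama's streaming /api/generate endpoint.

    Each line of the response body is a JSON object whose "response" field
    holds the next fragment; a line with "done": true marks the end.
    """
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": True}
    ).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:
            chunk = json.loads(line)
            if chunk.get("done"):
                break
            yield chunk.get("response", "")


def join_stream(chunks) -> str:
    """Assemble already-parsed NDJSON stream objects into the full reply."""
    return "".join(
        c.get("response", "") for c in chunks if not c.get("done")
    )
```

If the generator raises a connection error, Ollama is not running (or is listening on a non-default port).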
## Usage

- **Chat:** Open the VSLLama view in the Activity Bar to start chatting with your local models.
- **Code Actions:** Highlight code and right-click to access the "Explain", "Fix", and "Generate" tools.
- **Git Commits:** Generate context-aware commit messages from the Source Control view (both staged and unstaged changes are evaluated).
- **Code Autocomplete:** Get AI-powered inline code suggestions as you type (requires enabling in settings).
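Since autocomplete must be enabled in settings, a `settings.json` entry along these lines could be used. The setting keys shown here are hypothetical placeholders; check the extension's contributed settings in the VS Code Settings UI for the actual names.

```jsonc
{
  // Hypothetical keys for illustration only — verify against the
  // extension's actual contributed configuration.
  "vsllama.autocomplete.enabled": true,
  "vsllama.provider": "ollama",
  "vsllama.ollama.baseUrl": "http://localhost:11434"
}
```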