# Open Repo Chat

An offline AI-powered code assistant for VS Code. Chat with your codebase using local LLMs via Ollama: no cloud, no API keys, complete privacy.
## Features
- **Offline and Private** - All processing happens locally on your machine
- **Chat with Your Code** - Ask questions about your codebase and get contextual answers
- **Smart Code Search** - Uses Retrieval-Augmented Generation (RAG) to find the code most relevant to your question
- **Code Generation** - Generate code and apply it to files with one click
- **Conversation Memory** - Maintains context across multiple messages
- **Fast Local Inference** - Powered by Ollama, with GPU acceleration where available
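The extension's retrieval internals aren't shown in this README, but the RAG idea is straightforward: embed the question, embed the code chunks, and hand the most similar chunks to the model as context. A minimal sketch with toy vectors (the real extension would use an Ollama embedding model such as `nomic-embed-text`; all names here are hypothetical):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_chunks(query_vec, chunk_vecs, k=2):
    # Return the indices of the k chunks most similar to the query.
    ranked = sorted(range(len(chunk_vecs)),
                    key=lambda i: cosine(query_vec, chunk_vecs[i]),
                    reverse=True)
    return ranked[:k]

# Toy embeddings: chunk 0 points the same way as the query,
# chunk 2 is close, chunk 1 is orthogonal.
chunks = [[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]]
print(top_chunks([1.0, 0.0], chunks, k=2))  # → [0, 2]
```

The selected chunks are then prepended to the prompt sent to the chat model; `openRepoChat.contextChunks` controls how many are included.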
## Requirements
- [Ollama](https://ollama.com) must be installed and running
- Recommended: 8GB+ RAM for smaller models; 16GB+ for larger models
## Quick Start
1. Install [Ollama](https://ollama.com)
2. Install this extension
3. Open the Open Repo Chat sidebar (chat bubble icon)
4. Go to the **Setup** tab and verify Ollama is running
5. Click **Index Codebase** in the **Chat** tab
6. Start chatting with your code!
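The Ollama side of the steps above can be done from a terminal. The model names match the defaults listed under Settings; the install script is Ollama's official one for Linux/macOS (Windows users should use the installer from ollama.com):

```shell
# Install Ollama (Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the default chat and embedding models
ollama pull qwen2.5-coder:7b
ollama pull nomic-embed-text

# Confirm the server is up and the models are available
ollama list
```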
## Settings
- `openRepoChat.chatModel` - Ollama model used for chat (default: `qwen2.5-coder:7b`)
- `openRepoChat.embeddingModel` - Model used for embeddings (default: `nomic-embed-text`)
- `openRepoChat.ollamaUrl` - URL of the Ollama server
- `openRepoChat.contextChunks` - Number of code chunks included as context with each question
- `openRepoChat.temperature` - Response creativity, from 0 (deterministic) to 1 (creative)
- `openRepoChat.maxTokens` - Maximum response length, in tokens
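For reference, these settings map to `settings.json` entries like the following. The two model defaults are documented above and `http://localhost:11434` is Ollama's standard port; the remaining values are purely illustrative:

```json
{
  "openRepoChat.chatModel": "qwen2.5-coder:7b",
  "openRepoChat.embeddingModel": "nomic-embed-text",
  "openRepoChat.ollamaUrl": "http://localhost:11434",
  "openRepoChat.contextChunks": 5,
  "openRepoChat.temperature": 0.2,
  "openRepoChat.maxTokens": 2048
}
```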
## Privacy
All data stays on your machine. No telemetry, no API keys required, works completely offline.
## Release Notes

### 1.0.0
Initial release with RAG-powered code chat, conversation memory, and one-click code apply.
## License
MIT