# 🤖 Local AI Pilot

Supercharge your coding with local AI models! A lightweight extension that unlocks powerful AI models running directly on your machine, keeping your code secure, private, and responsive.

## ⛓ Key Features

- **AI-powered assistance**: Get real-time code completion, chat with the AI about your code, and tackle complex tasks.
- **Local Ollama models**: Leverage the power of Ollama for a smooth offline experience and complete control over your data.
- **Fully customizable**: Use containers to tailor the extension to your specific needs and preferences.
- **Completion with context**: Get suggestions tailored to the specific context of your code.
- **Document Q&A**: Ingest your documents and ask questions offline using RAG (Retrieval-Augmented Generation).
- **Chat history**: Keep track of your conversations and reuse past interactions for future reference.
- **Remote models**: Supports remote inference through providers such as OpenAI, Gemini, Cohere, Anthropic, and Codestral for resource-constrained devices.

## 🚀 Quick Start

### Local setup

The extension supports two modes, configurable under Settings > Extensions > Local AI Pilot > Mode:

- **Standalone Mode** (default): The extension connects directly to an Ollama container or standalone instance.
- **Container Mode**: An intermediate API service bridges the Ollama service and the extension, enabling extra configurations and features.
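For Standalone Mode, all that is needed is a reachable Ollama instance. A minimal sketch of getting one running, either natively or via Docker (the model name below is only an example; pull whichever model the extension is configured to use):

```sh
# Native install: start the Ollama server (listens on localhost:11434 by default)
ollama serve

# Pull a code-oriented model for chat/completion (example model)
ollama pull codellama

# Alternatively, run Ollama as a container
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```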
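The mode can also be set directly in `settings.json`. The setting IDs below are illustrative assumptions, not the extension's documented keys; verify them against the extension's contributed settings:

```jsonc
{
  // Hypothetical keys -- verify under Settings > Extensions > Local AI Pilot
  "localAIPilot.mode": "standalone",                  // or "container"
  "localAIPilot.ollamaUrl": "http://localhost:11434"  // default Ollama endpoint
}
```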
## 📖 Usage

### 💭 Code chat

### 📝 Code completion

> Note: Use LF line endings for proper formatting.

### 📝 Code completion with context

Provide the paths of files to use as additional context during code completion (see the settings sketch at the end of this section).

### 🛠️ Explain, Fix, or Review code blocks

### 📃 Document Q&A (container mode)

### 📜 Chat History & Caching (container mode)
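Feature options such as context files and remote providers are configured the same way. A sketch, again using assumed key names rather than documented ones:

```jsonc
{
  // Hypothetical keys -- check the extension's settings UI for the real ones
  "localAIPilot.completion.contextFiles": [
    "src/types.ts",
    "src/utils/helpers.ts"
  ],
  // Remote inference for resource-constrained devices; the provider names
  // come from the feature list above, but these key names are assumptions
  "localAIPilot.remote.provider": "openai",
  "localAIPilot.remote.apiKey": "<YOUR_API_KEY>"
}
```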
## Credits