Involvex Smart Autocomplete is a cutting-edge VS Code extension that delivers intelligent, context-aware tab completions by dynamically integrating multiple AI models and providers.
Whether you rely on powerful cloud providers like OpenAI and Anthropic or prefer privacy-focused local LLMs (via Ollama or LM Studio), this extension adapts to your workflow, learns your patterns, and helps you code faster.
🚀 Features
🧠 Adaptive Intelligence
Multi-Context Awareness: Analyzes your active file, cursor position, syntax, and project structure.
Project-Wide Analysis: Understands cross-file dependencies and repository conventions.
Git Awareness: Prioritizes completions relevant to recent changes and active branches.
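The context signals above can be pictured as a single request object assembled before each completion. This is a minimal sketch; the interface and field names are illustrative assumptions, not the extension's actual types.

```typescript
// Hypothetical shape of the context gathered for one completion request.
interface CompletionContext {
  filePath: string;
  languageId: string;
  prefix: string; // text before the cursor
  suffix: string; // text after the cursor
  gitBranch?: string; // optional Git awareness signal
}

// Combine the surrounding code into a single prompt for the model.
function buildPrompt(ctx: CompletionContext): string {
  const header = `// file: ${ctx.filePath} (${ctx.languageId})`;
  return `${header}\n${ctx.prefix}<CURSOR>${ctx.suffix}`;
}
```

In practice the real extension would enrich this with cross-file dependencies and repository conventions; the sketch only shows the basic assembly step.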
🔌 Provider Agnostic
Seamless Switching: Toggle between OpenAI, Anthropic, Gemini, Mistral, and Local LLMs instantly.
Fallback Mechanisms: Automatically switches providers if one hits rate limits or goes down.
Local-First Support: First-class support for llama.cpp, Ollama, and LM Studio.
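The fallback mechanism can be sketched as a simple provider chain: try each configured provider in order and move to the next when one fails (rate limit, outage, etc.). The `CompletionProvider` interface and provider names here are illustrative assumptions, not the extension's real API.

```typescript
// Hypothetical provider abstraction — names are illustrative.
interface CompletionProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// Try providers in priority order; fall through on any failure.
async function completeWithFallback(
  providers: CompletionProvider[],
  prompt: string
): Promise<string> {
  let lastError: unknown;
  for (const provider of providers) {
    try {
      return await provider.complete(prompt);
    } catch (err) {
      lastError = err; // rate-limited or down — try the next provider
    }
  }
  throw lastError; // every provider failed
}
```

A local LLM (Ollama, LM Studio) would typically sit last in the chain as the always-available fallback.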
⚡ Performance & Privacy
Smart Caching: Caches frequent requests to minimize latency and cost.
Privacy Controls: Opt-in telemetry. Your code stays yours.
Debounced Triggers: Auto-activates only in high-confidence scenarios to avoid distraction.
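Caching and debouncing are standard techniques; this is a minimal sketch of how they might look here, with the delay value and cache key scheme as assumptions rather than the extension's actual behavior.

```typescript
// Debounce: only fire after the user pauses typing for `delayMs`.
function debounce<T extends (...args: any[]) => void>(fn: T, delayMs: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Parameters<T>) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Cache: reuse responses for identical prompts to cut latency and cost.
class CompletionCache {
  private store = new Map<string, string>();
  get(prompt: string): string | undefined {
    return this.store.get(prompt);
  }
  set(prompt: string, completion: string): void {
    this.store.set(prompt, completion);
  }
}
```

A real implementation would also bound the cache size and invalidate entries when the surrounding code changes.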
🛠️ Tech Stack
This project is built using Bun for fast dependency management and scripting.
Foundation: VS Code Extension API & LSP
Inference: WebAssembly & HTTP Streaming
Language: TypeScript
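For development, the usual Bun workflow would look like the following. The `build` script name is an assumption about this project's `package.json`; `bun install` and `bun test` are standard Bun commands.

```shell
bun install     # install dependencies
bun run build   # run the (assumed) build script to compile TypeScript
bun test        # run the test suite
```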
📦 Installation
Install the extension from the VS Code Marketplace.
Open your project.
Configure your API keys in the extension settings.
Start coding!
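API keys are configured through VS Code settings. The setting keys below are hypothetical placeholders to show the shape of the configuration; check the extension's settings UI for the actual names.

```jsonc
// settings.json — keys shown here are illustrative, not the real setting names
{
  "involvex.provider": "openai",
  "involvex.openai.apiKey": "<your-api-key>",
  "involvex.local.endpoint": "http://localhost:11434" // e.g. a local Ollama server
}
```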
📖 Documentation
Setup Guide - Installation and initial configuration.