# 🚀 AI Code Reviewer

An intelligent, real-time code review extension for Visual Studio Code, powered by local & cloud LLMs (Codestral, Mistral, DeepSeek, Ollama, OpenAI).

AI Code Reviewer brings AI-powered code analysis, refactoring, and instant explanations directly into your workflow.

## ⭐ Key Features

### 🔍 1. Full-File AI Code Review

Get a comprehensive analysis of:

- Style & readability issues
- Performance problems
- Security issues
Runs with: `Ctrl + Alt + R`

### ✂️ 2. Selection-Based Review (Advanced)

Review only the highlighted part of your code: `Ctrl + Alt + Shift + R`

Great for isolating small blocks or debugging tricky logic.

### 💡 3. Explain Code (Beginner-Friendly)

Select any code → get a clear, step-by-step explanation in Markdown: `Ctrl + Alt + E`

Perfect for learning, documentation, or onboarding new developers.

### 🛠️ 4. AI Refactor (Auto-Fix & Improve Code)

Select code → the AI rewrites it with improvements.
Run with: `Ctrl + Alt + F`

### 📊 5. Issues Panel (Built-In Dashboard)

A dedicated panel inside VS Code that lists the issues found by reviews.
Open anytime with: `Ctrl + Alt + I`

### 🔄 6. Continuous Code Review Mode (Real-Time)

Like GitHub Copilot diagnostics, but powered by local AI. When enabled, the extension automatically re-reviews your code as you type, after a configurable debounce delay.
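Continuous mode only works if the model is called after you pause typing, not on every keystroke. A minimal TypeScript sketch of such a debounce (illustrative only; the extension's actual implementation may differ):

```typescript
// Returns a wrapper that delays `fn` until `delayMs` ms have passed
// with no further calls, mirroring the continuousReviewDebounceMs setting.
function debounce<T extends unknown[]>(
  fn: (...args: T) => void,
  delayMs: number
): (...args: T) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: T) => {
    // A new call cancels the pending one and restarts the countdown.
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}
```

In the extension, a wrapper like this would sit around the document-change listener, with the delay read from `continuousReviewDebounceMs`.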
Enable in Settings: **AI Code Reviewer → enableContinuousReview**

### ⚙️ 7. Customizable Review Modes

Choose how strict the AI should be:

- `relaxed`
- `balanced`
- `strict`
Disable entire categories via the `includeStyleIssues`, `includePerformanceIssues`, and `includeSecurityIssues` settings.
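In `settings.json`, that might look like the following (the `codesense-mentor.` prefix on setting keys is an assumption; check the extension's contributed configuration):

```json
{
  "codesense-mentor.reviewMode": "strict",
  "codesense-mentor.includeStyleIssues": false,
  "codesense-mentor.includePerformanceIssues": true,
  "codesense-mentor.includeSecurityIssues": true
}
```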
### 🔄 8. Multi-Model Support

Works with any OpenAI-compatible API:

- LM Studio (local)
- Ollama (local)
- OpenAI
- Mistral / Codestral
- DeepSeek
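"OpenAI-compatible" means the extension only ever needs the standard chat-completions request shape. A TypeScript sketch of that stage (the function names and the issue format are illustrative assumptions, not the extension's actual API):

```typescript
// Issue shape the model could be prompted to return (an assumption).
interface ReviewIssue {
  line: number;
  severity: "error" | "warning" | "info";
  message: string;
}

// Build a request body following the OpenAI chat-completions schema.
function buildReviewRequest(model: string, code: string) {
  return {
    model,
    messages: [
      {
        role: "system",
        content:
          "Review the code. Reply with a JSON array of {line, severity, message}.",
      },
      { role: "user", content: code },
    ],
    temperature: 0,
  };
}

// Extract the JSON issue array from the model's reply,
// tolerating any extra prose around the payload.
function parseIssues(reply: string): ReviewIssue[] {
  const start = reply.indexOf("[");
  const end = reply.lastIndexOf("]");
  if (start === -1 || end <= start) return [];
  try {
    const parsed = JSON.parse(reply.slice(start, end + 1));
    return Array.isArray(parsed) ? (parsed as ReviewIssue[]) : [];
  } catch {
    return [];
  }
}
```

The same request body works against LM Studio, Ollama, or OpenAI itself; only `apiBaseUrl` and `model` change.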
### 🧩 9. Fully Offline Capability

Using LM Studio or Ollama, the entire AI functionality can run locally, fully offline: no code ever leaves your machine.
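For instance, a fully offline Ollama setup could look like this (the `codesense-mentor.` setting prefix is assumed; Ollama's OpenAI-compatible endpoint listens on port 11434 by default, and the key value is ignored):

```json
{
  "codesense-mentor.apiBaseUrl": "http://localhost:11434/v1",
  "codesense-mentor.apiKey": "ollama",
  "codesense-mentor.model": "llama3"
}
```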
## 🧱 Architecture Overview

```
┌──────────────────────┐
│      VS Code UI      │
│  (Editor, Commands)  │
└──────────┬───────────┘
           │
           ▼
┌──────────────────────┐
│   AI Code Reviewer   │
│  (Extension Logic)   │
└──────────┬───────────┘
           │ Builds prompts
           ▼
┌──────────────────────┐
│    Prompt Builder    │
└──────────┬───────────┘
           │ Sends requests
           ▼
┌──────────────────────┐
│      LLM Client      │
│ (OpenAI API Schema)  │
└──────────┬───────────┘
           │
           ▼
┌──────────────────────┐
│  Local / Cloud LLM   │
│ Codestral, Mistral…  │
└──────────┬───────────┘
           │ Returns JSON
           ▼
┌───────────────────────────┐
│ Diagnostics + Issue Panel │
└───────────────────────────┘
```

## 🖥️ Screenshots

(Add your own here after generating screenshots.)
## 📦 Installation

1. Clone the repo
2. Install dependencies
3. Build the extension
4. Run the extension: in VS Code, press `F5` to launch an Extension Development Host with the extension loaded.
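Steps 1–3 above, assuming a standard npm-based extension build (the repository URL and the build script name are placeholders; check the project's `package.json`):

```shell
# Placeholder URL: substitute the actual repository.
git clone https://github.com/<your-username>/ai-code-reviewer.git
cd ai-code-reviewer

# Install dependencies
npm install

# Build the extension (script name assumed)
npm run compile
```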
## 🤖 LLM Setup (LM Studio Recommended)

1. Download LM Studio
2. Load a model (e.g., `codestral-22b-v0.1`)
3. Start the server
4. Copy the endpoint (usually `http://127.0.0.1:1234/v1`)
5. Update the settings in VS Code:
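A `settings.json` sketch for the LM Studio setup above (the `codesense-mentor.` key prefix is an assumption; any non-empty API key works with LM Studio):

```json
{
  "codesense-mentor.apiBaseUrl": "http://127.0.0.1:1234/v1",
  "codesense-mentor.apiKey": "lm-studio",
  "codesense-mentor.model": "codestral-22b-v0.1"
}
```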
💡 LM Studio does not require a real API key.

## 🎛️ Commands & Shortcuts

| Action | Shortcut | Command |
| --- | --- | --- |
| Review Entire File | `Ctrl+Alt+R` | `codesense-mentor.reviewFile` |
| Review Selection | `Ctrl+Alt+Shift+R` | `codesense-mentor.reviewSelection` |
| Explain Code | `Ctrl+Alt+E` | `codesense-mentor.explainSelection` |
| Refactor Code | `Ctrl+Alt+F` | `codesense-mentor.refactorSelection` |
| Open Issues Panel | `Ctrl+Alt+I` | `codesense-mentor.openIssuesPanel` |
| Toggle Continuous Review | `Ctrl+Alt+C` | `codesense-mentor.toggleContinuousReview` |

## ⚙️ Extension Settings

| Setting | Description |
| --- | --- |
| `apiKey` | API key for OpenAI-compatible endpoints (LM Studio accepts any value). |
| `apiBaseUrl` | Endpoint URL (LM Studio, Ollama, etc.). |
| `model` | The model to run (`codestral`, `mistral`, `llama3`, etc.). |
| `reviewMode` | `relaxed`, `balanced`, or `strict`. |
| `includeStyleIssues` | Show style/readability issues. |
| `includePerformanceIssues` | Detect performance problems. |
| `includeSecurityIssues` | Detect security issues. |
| `enableContinuousReview` | Run automatic reviews while typing. |
| `continuousReviewDebounceMs` | Delay (ms) before an auto-review triggers. |

## 🛠️ Tech Stack

- TypeScript
- VS Code Extension API
- Node.js
- LM Studio (local LLMs)
- Codestral / Mistral / DeepSeek / Ollama
- OpenAI API schema

## 📄 Folder Structure
## 📜 License

MIT License: free for personal & commercial use.

## 💬 Contact

If you use this project or want to collaborate, feel free to connect!

**Author:** Jenil Gohel
**Location:** Waterloo, Canada