🚀 AI Code Reviewer

An Intelligent, Real-Time Code Review Extension for Visual Studio Code

Powered by Local & Cloud LLMs (Codestral, Mistral, DeepSeek, Ollama, OpenAI)

AI Code Reviewer brings AI-powered code analysis, refactoring, and instant explanations directly into your workflow.
Built as a full VS Code extension, it integrates seamlessly with LM Studio, Ollama, or OpenAI-compatible APIs to deliver smart, real-time insights into your code.


⭐ Key Features

🔍 1. Full-File AI Code Review

Get a comprehensive analysis of:

  • Bugs & logic errors
  • Unhandled edge cases
  • Security issues
  • Performance problems
  • Readability / style issues (optional)

Runs with: Ctrl + Alt + R


✂️ 2. Selection-Based Review (Advanced)

Review only the highlighted part of your code. Great for isolating small blocks or debugging tricky logic.

Run with: Ctrl + Alt + Shift + R


💡 3. Explain Code (Beginner-Friendly)

Select any code → get a clear, step-by-step explanation in Markdown. Perfect for learning, documentation, or onboarding new developers.

Run with: Ctrl + Alt + E


🛠️ 4. AI Refactor (Auto-Fix & Improve Code)

Select code → AI rewrites it with:

  • Better readability
  • Error handling
  • Idiomatic style
  • Cleaner logic
  • Edge case protection

Run with: Ctrl + Alt + F


📊 5. Issues Panel (Built-In Dashboard)

A dedicated panel inside VS Code showing:

  • Issue number
  • Severity
  • Line number
  • Title
  • Explanation
  • Learning tip

Open anytime with: Ctrl + Alt + I
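
Each row in the panel corresponds to one structured issue parsed from the model's reply. As a rough sketch, an issue object could look like this (the field names below are illustrative, not the extension's exact schema):

// Illustrative shape of a single review issue as displayed in the Issues Panel.
// Field names are assumptions for this sketch, not the extension's exact schema.
interface ReviewIssue {
  id: number;                                 // issue number
  severity: 'info' | 'warning' | 'error';     // how serious the finding is
  line: number;                               // line number in the reviewed file
  title: string;                              // short summary of the problem
  explanation: string;                        // why it is a problem
  learningTip: string;                        // tip to help avoid the issue next time
}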


🔄 6. Continuous Code Review Mode (Real-Time)

Like GitHub Copilot diagnostics, but powered by local AI.

When enabled:

  • The extension auto-reviews your file after you stop typing
  • Fully configurable debounce (default: 4s)

Enable in Settings: AI Code Reviewer → enableContinuousReview
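
For a rough idea of how this works under the hood, a debounced auto-review can be wired up with the standard VS Code API roughly like this (a sketch only; the extension's actual implementation may differ):

// Sketch: debounced continuous review using the VS Code API.
// Helper names are illustrative, not the extension's actual code.
import * as vscode from 'vscode';

const DEFAULT_DEBOUNCE_MS = 4000; // documented default of 4 seconds
let timer: NodeJS.Timeout | undefined;

export function enableContinuousReview(
  context: vscode.ExtensionContext,
  reviewFile: (doc: vscode.TextDocument) => void
) {
  const subscription = vscode.workspace.onDidChangeTextDocument((event) => {
    if (timer) { clearTimeout(timer); }                               // reset on every edit
    timer = setTimeout(() => reviewFile(event.document), DEFAULT_DEBOUNCE_MS);
  });
  context.subscriptions.push(subscription);
}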


⚙️ 7. Customizable Review Modes

Choose how strict the AI should be:

  • Relaxed – only major issues
  • Balanced – recommended
  • Strict – flags every possible issue

Disable entire categories:

  • Style issues
  • Performance concerns
  • Security warnings

🔄 8. Multi-Model Support

Works with ANY OpenAI-compatible API:

  • LM Studio
  • Codestral
  • Ollama
  • Mistral
  • DeepSeek
  • OpenAI
  • Any local GGUF model served through an OpenAI-compatible endpoint

🧩 9. Full Offline Capability

With LM Studio or Ollama, all of the AI functionality can run:

  • Without Internet
  • Without sending code externally
  • Completely free

🧱 Architecture Overview

┌──────────────────────┐
│      VS Code UI      │
│  (Editor, Commands)  │
└───────────┬──────────┘
            │
            ▼
┌──────────────────────┐
│   AI Code Reviewer   │
│  (Extension Logic)   │
└───────────┬──────────┘
            │  Builds prompts
            ▼
┌──────────────────────┐
│    Prompt Builder    │
└───────────┬──────────┘
            │  Sends requests
            ▼
┌──────────────────────┐
│      LLM Client      │
│ (OpenAI API Schema)  │
└───────────┬──────────┘
            │
            ▼
┌──────────────────────┐
│  Local / Cloud LLM   │
│ Codestral, Mistral…  │
└───────────┬──────────┘
            │  Returns JSON
            ▼
┌───────────────────────────┐
│ Diagnostics + Issue Panel │
└───────────────────────────┘
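
To make the LLM Client box concrete, here is a rough sketch of the kind of request it sends to an OpenAI-compatible endpoint. The endpoint, model name, prompt text, and response handling below are assumptions for illustration, not the extension's exact code:

// Sketch: send the built prompt to an OpenAI-compatible /chat/completions endpoint
// and parse the JSON the model returns. Requires Node 18+ for the built-in fetch.
async function requestReview(prompt: string): Promise<unknown> {
  const response = await fetch('http://127.0.0.1:1234/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer test',          // LM Studio accepts any key
    },
    body: JSON.stringify({
      model: 'codestral-22b-v0.1',
      messages: [
        { role: 'system', content: 'You are a code reviewer. Reply with JSON only.' },
        { role: 'user', content: prompt },
      ],
      temperature: 0,
    }),
  });
  const data = await response.json();
  // OpenAI-compatible servers put the model's text in choices[0].message.content.
  return JSON.parse(data.choices[0].message.content);
}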


🖥️ Screenshots (Add Your Own Here!)

Add these after generating screenshots:

  • Full-file review
  • Selection review
  • Explain code panel
  • Refactor in action
  • Issues panel
  • LM Studio server logs
  • Continuous mode demo

📦 Installation

1. Clone the repo

git clone https://github.com/your-username/ai-code-reviewer.git
cd ai-code-reviewer

2. Install dependencies

npm install

3. Build the extension

npm run compile

4. Run the extension

In VS Code:

Run → Run Extension (or press F5). This opens the Extension Development Host window.

🤖 LLM Setup (LM Studio Recommended)

1. Download LM Studio
2. Load a model (e.g., codestral-22b-v0.1)
3. Start the server
4. Copy the endpoint (usually: http://127.0.0.1:1234/v1)
5. Update settings in VS Code:

AI Code Reviewer → apiBaseUrl = http://127.0.0.1:1234/v1
AI Code Reviewer → model = codestral-22b-v0.1
AI Code Reviewer → apiKey = test

💡 LM Studio does not require a real API key.
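
If the extension can't reach the server, a quick sanity check is to list the models the endpoint exposes, so the model setting matches an available model ID. A minimal sketch (assumes Node 18+ for the built-in fetch):

// Sketch: list the model IDs exposed by an OpenAI-compatible server (e.g., LM Studio).
async function listLocalModels(baseUrl = 'http://127.0.0.1:1234/v1'): Promise<void> {
  const res = await fetch(`${baseUrl}/models`, {
    headers: { Authorization: 'Bearer test' },   // any key works for LM Studio
  });
  const body = await res.json();
  for (const model of body.data) {
    console.log(model.id);                       // e.g., codestral-22b-v0.1
  }
}

listLocalModels().catch(console.error);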

🎛️ Commands & Shortcuts

Action                      Shortcut            Command
Review Entire File          Ctrl+Alt+R          codesense-mentor.reviewFile
Review Selection            Ctrl+Alt+Shift+R    codesense-mentor.reviewSelection
Explain Code                Ctrl+Alt+E          codesense-mentor.explainSelection
Refactor Code               Ctrl+Alt+F          codesense-mentor.refactorSelection
Open Issues Panel           Ctrl+Alt+I          codesense-mentor.openIssuesPanel
Toggle Continuous Review    Ctrl+Alt+C          codesense-mentor.toggleContinuousReview
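
For reference, here is roughly how one of these commands is registered in a VS Code extension. The command ID comes from the table above; the handler body is a sketch, not the extension's actual logic:

// Sketch: registering the full-file review command in extension.ts.
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    vscode.commands.registerCommand('codesense-mentor.reviewFile', async () => {
      const editor = vscode.window.activeTextEditor;
      if (!editor) {
        vscode.window.showInformationMessage('Open a file to review.');
        return;
      }
      const code = editor.document.getText();   // full-file review uses the whole document
      // ...build the prompt, call the LLM client, publish diagnostics...
    })
  );
}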

⚙️ Extension Settings

Setting                       Description
apiKey                        API key for OpenAI-compatible endpoints (LM Studio can use anything)
apiBaseUrl                    Endpoint URL (LM Studio, Ollama, etc.)
model                         The model to run (codestral, mistral, llama3, etc.)
reviewMode                    relaxed, balanced, or strict
includeStyleIssues            Show style/readability issues
includePerformanceIssues      Detect performance problems
includeSecurityIssues         Detect security issues
enableContinuousReview        Run automatic reviews while typing
continuousReviewDebounceMs    Delay (ms) before auto-review triggers
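
As a sketch of how these settings might be read inside the extension (the 'aiCodeReviewer' section name below is an assumption; check the extension's package.json for the real prefix):

// Sketch: reading the extension settings via the VS Code configuration API.
// The configuration section name is assumed for illustration.
import * as vscode from 'vscode';

function getReviewConfig() {
  const cfg = vscode.workspace.getConfiguration('aiCodeReviewer');
  return {
    apiBaseUrl: cfg.get<string>('apiBaseUrl', 'http://127.0.0.1:1234/v1'),
    model: cfg.get<string>('model', 'codestral-22b-v0.1'),
    reviewMode: cfg.get<string>('reviewMode', 'balanced'),   // relaxed | balanced | strict
    enableContinuousReview: cfg.get<boolean>('enableContinuousReview', false),
    continuousReviewDebounceMs: cfg.get<number>('continuousReviewDebounceMs', 4000),
  };
}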

🛠️ Tech Stack

  • TypeScript
  • VS Code Extension API
  • Node.js
  • LM Studio (local LLMs)
  • Codestral / Mistral / DeepSeek / Ollama
  • OpenAI API schema

📄 Folder Structure


src/
  extension.ts         → main logic
  llmClient.ts         → handles API calls
  promptBuilder.ts     → builds system/user prompts
  issuesPanel.ts       → dashboard for issues
  diagnostics.ts       → VS Code diagnostics integration
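
As a rough sketch of what promptBuilder.ts does with the review mode and category toggles (the actual prompt wording used by the extension is not reproduced here):

// Sketch: folding the review mode and category toggles into a system prompt.
type ReviewMode = 'relaxed' | 'balanced' | 'strict';

function buildSystemPrompt(mode: ReviewMode, includeStyle: boolean, includeSecurity: boolean): string {
  const parts = [
    'You are an expert code reviewer. Report bugs, logic errors, and unhandled edge cases.',
    mode === 'strict'
      ? 'Flag every possible issue, however minor.'
      : mode === 'relaxed'
        ? 'Report only major issues.'
        : 'Balance thoroughness with signal; skip trivial nitpicks.',
  ];
  if (includeStyle) { parts.push('Also report readability and style issues.'); }
  if (includeSecurity) { parts.push('Also report security vulnerabilities.'); }
  parts.push('Respond with a JSON array of issues.');
  return parts.join(' ');
}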

📜 License

MIT License. Free for personal & commercial use.

💬 Contact

If you use this project or want to collaborate, feel free to connect!

Author: Jenil Gohel
Location: Waterloo, Canada
