LLM Review Connector

Martin Matvoz | 2 installs | (0) | Free
Review your code using a local (LAN) instance of Ollama

llm-review-connector README

Utilise your local Ollama instance to review code

Features

This extension connects to a local (LAN) instance of Ollama and runs code reviews on either a) selected code or b) a file or a folder of files.

Requirements

A local (LAN) instance of Ollama with enough memory for the selected LLM. The default model is qwen3-coder:30b, which requires at least 17.7 GB of RAM to load.
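To prepare the Ollama machine ahead of time, you can pull the default model and check that the server answers. The host and port below are the extension's defaults; substitute the address of your LAN machine if Ollama runs elsewhere:

```shell
# Pull the default model on the Ollama machine (loading it needs ~17.7 GB of RAM)
ollama pull qwen3-coder:30b

# Check that the Ollama API answers and that the model appears in the list
curl http://localhost:11434/api/tags
```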

Extension Settings

This extension contributes the following settings:

  • llmReviewConnector.host: Host address of the Ollama instance. Default is http://localhost
  • llmReviewConnector.port: Port of the Ollama instance. Default is 11434
  • llmReviewConnector.model: Model used for code review. Default is qwen3-coder:30b
  • llmReviewConnector.maxFileSizeKB: Maximum file size (KB) per file before chunking. Default is 200
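For example, a workspace settings.json pointing the extension at an Ollama machine elsewhere on the LAN might look like this (the address 192.168.1.50 is a placeholder, and whether the port is stored as a number or a string depends on the extension's settings schema):

```json
{
  "llmReviewConnector.host": "http://192.168.1.50",
  "llmReviewConnector.port": 11434,
  "llmReviewConnector.model": "qwen3-coder:30b",
  "llmReviewConnector.maxFileSizeKB": 200
}
```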

Limitations

Due to the complexity of the task, larger files or code snippets might behave unexpectedly. If a model output cannot be parsed, it is pasted to the output window instead.
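The size-based chunking controlled by llmReviewConnector.maxFileSizeKB can be sketched roughly as follows. The extension's actual splitting logic is not published, so this is only an illustrative assumption: split a file's text into pieces no larger than the configured limit, breaking on line boundaries where possible.

```python
# Illustrative sketch only: the extension's real chunking logic is not
# published. Splits text into chunks no larger than max_file_size_kb
# (default 200, mirroring llmReviewConnector.maxFileSizeKB), breaking
# on line boundaries so no line is split mid-way.

def chunk_text(text: str, max_file_size_kb: int = 200) -> list[str]:
    limit = max_file_size_kb * 1024  # size limit in bytes
    chunks: list[str] = []
    current: list[str] = []
    current_size = 0
    for line in text.splitlines(keepends=True):
        line_size = len(line.encode("utf-8"))
        # Flush the current chunk when adding this line would exceed the limit
        if current and current_size + line_size > limit:
            chunks.append("".join(current))
            current, current_size = [], 0
        current.append(line)
        current_size += line_size
    if current:
        chunks.append("".join(current))
    return chunks
```

A small file fits in a single chunk, while a file several times the limit yields one chunk per limit-sized run of lines; joining the chunks reproduces the original text.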

Release Notes

1.0.0

Initial release of llm-review-connector

1.0.1

Removed private files that were included by accident

2.0.0

Added right-click functionality and a sidebar output window
