KZRO Local AI Agent (VS Code)

KZRO is a local-first AI coding agent for VS Code.

It provides a chat-based assistant that can:

  • Read editor/workspace context on demand
  • Propose and apply safe multi-file patches
  • Validate changes (TypeScript compile + diagnostics)
  • Iterate with an agentic plan→act→validate loop

KZRO is designed to be practical and careful:

  • It prefers minimal diffs.
  • It applies patches through a controlled patch engine.
  • It uses checkpoints so you can undo/restore changes.
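The loop described above can be sketched roughly as follows. All names here (`Workspace`, `Patch`, `applyAndValidate`) are illustrative assumptions, not KZRO's actual internals:

```typescript
// Minimal sketch of one agent step: checkpoint, apply a patch batch,
// validate once, and roll back on failure. Illustrative names only.

type Patch = { file: string; contents: string };
type ValidationResult = { ok: boolean; diagnostics: string[] };

interface Workspace {
  snapshot(): Map<string, string>;                 // checkpoint: file -> contents
  restore(checkpoint: Map<string, string>): void;  // undo back to a checkpoint
  apply(patch: Patch): void;                       // write one patch
}

function applyAndValidate(
  ws: Workspace,
  patches: Patch[],
  validate: () => ValidationResult
): ValidationResult {
  const checkpoint = ws.snapshot();      // checkpoint before touching files
  for (const p of patches) ws.apply(p);  // act: apply the whole batch
  const result = validate();             // validate once per batch
  if (!result.ok) ws.restore(checkpoint); // roll back safely on failure
  return result;
}
```

A failed validation leaves the workspace exactly as it was before the batch, which is what makes auto-apply tolerable in practice.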

What you get

Chat + actions

  • Chat with your codebase inside a Webview panel.
  • Quick action buttons (Plan / Fix / Refactor / Explain / Search) to steer the next request.
  • Agent feed that narrates steps like planning, applying patches, validating, and auto-fixing.

Safe patch workflow

  • Patch-only output contract: when code changes are needed, prompts require the model to respond with patches rather than free-form rewrites.
  • Batch apply: if the model produces multiple patches, KZRO can apply them as a batch and validate once.
  • Checkpoints + undo/restore: patch application snapshots allow rolling back safely.
  • Project scaffolding support: patches can create new files when needed.
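One way a checkpoint/undo mechanism like this could work is sketched below over an in-memory file map. This is an assumed design, not KZRO's actual patch engine; the `null` prior value marks a file that did not exist before the patch, so undoing also removes files created by scaffolding:

```typescript
// Illustrative checkpoint/undo: record prior contents of every file a
// patch batch will touch, and restore (or delete newly created files)
// on undo. Not KZRO's real implementation.

class Checkpoints {
  private stack: Map<string, string | null>[] = [];
  constructor(private files: Map<string, string>) {}

  // Snapshot the prior state of the files a batch is about to touch.
  snapshot(touched: string[]): void {
    const entry = new Map<string, string | null>();
    for (const f of touched) entry.set(f, this.files.get(f) ?? null);
    this.stack.push(entry);
  }

  // Restore the most recent checkpoint; returns false if none exists.
  undo(): boolean {
    const entry = this.stack.pop();
    if (!entry) return false;
    for (const [f, prev] of entry) {
      if (prev === null) this.files.delete(f); // file was created by the patch
      else this.files.set(f, prev);            // file was modified by the patch
    }
    return true;
  }
}
```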

Clarifying questions (Options UI)

When something is ambiguous, KZRO can ask you a question with clickable options inside the UI.
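A question with clickable options could be modeled roughly like this. The message shape and names are hypothetical; the actual webview protocol is internal to KZRO:

```typescript
// Hypothetical shape of a clarifying question shown in the chat UI.
interface OptionQuestion {
  question: string;
  options: { id: string; label: string }[];
}

// Resolve a clicked option id back to its label (undefined if unknown).
function pickOption(q: OptionQuestion, id: string): string | undefined {
  return q.options.find((o) => o.id === id)?.label;
}

const q: OptionQuestion = {
  question: "Which file should the fix target?",
  options: [
    { id: "client", label: "src/client.ts" },
    { id: "server", label: "src/server.ts" },
  ],
};
```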

Provider support

Local-first with Ollama, plus optional cloud providers via REST endpoints:

  • Ollama (local)
  • OpenAI
  • OpenAI-compatible (custom base URL)
  • Anthropic
  • Gemini

Privacy & safety notes

  • Local by default: with Ollama selected, requests stay on your machine.
  • Cloud is optional: if you enable a cloud provider, your prompt/context is sent to that provider’s API.
  • API keys: set keys via VS Code settings. Do not hardcode keys in your workspace.
  • Commands: running shell commands is gated by an allowlist (you can control which commands are allowed).
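In its simplest form, a gate like this compares the executable name against the configured list. This is a sketch under that assumption; the real `kzro.commandAllowlist` matching rules are not documented here:

```typescript
// Simplest possible allowlist gate: permit a shell command only when its
// executable name appears in the configured list. Illustrative only.
function isCommandAllowed(command: string, allowlist: string[]): boolean {
  const executable = command.trim().split(/\s+/)[0] ?? "";
  return executable.length > 0 && allowlist.includes(executable);
}
```

Matching only the executable name is deliberately conservative: arguments can still be dangerous, so a tight allowlist (e.g. just `npm` and `tsc`) is safer than a broad one.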

Usage

  1. Install/configure a model (recommended: Ollama locally).
  2. Run the command:
    • KZRO: Open AI Assistant
  3. Ask for:
    • A fix (errors, diagnostics)
    • A refactor (small, targeted)
    • A patch across multiple files
    • A plan for multi-step work

Settings

All settings are under the kzro.* namespace.

  • kzro.provider
  • kzro.ollamaBaseUrl
  • kzro.defaultModel
  • kzro.cloudModel
  • kzro.openaiApiKey
  • kzro.openaiCompatibleBaseUrl
  • kzro.openaiCompatibleApiKey
  • kzro.anthropicApiKey
  • kzro.geminiApiKey
  • kzro.agentMode
  • kzro.agentAutoApply
  • kzro.agentMaxAutoApply
  • kzro.commandAllowlist
  • kzro.webSearchProvider
  • kzro.searxngBaseUrl
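As an example, a local-first setup in settings.json might look like this. The values and value types shown are illustrative assumptions, not authoritative defaults:

```json
{
  "kzro.provider": "ollama",
  "kzro.ollamaBaseUrl": "http://127.0.0.1:11434",
  "kzro.defaultModel": "qwen2.5-coder:7b",
  "kzro.agentAutoApply": true,
  "kzro.agentMaxAutoApply": 3,
  "kzro.commandAllowlist": ["npm", "node", "tsc"]
}
```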

Requirements

  • VS Code
  • Node.js (recommended v18+)
  • Ollama running locally (default: http://127.0.0.1:11434)

Run (dev)

  1. Install dependencies:
    • npm i
  2. Compile:
    • npm run compile
  3. Press F5 in VS Code to launch the Extension Development Host.
  4. Run command: KZRO: Open AI Assistant
