IndirectTek Vibe Engine 🤖⚡

"Autonomous Local AI Agent. Maximum Badassary."

IndirectTek Vibe Engine is a local-first, autonomous VS Code extension designed for developers who want code, not conversation. Powered by your local Ollama instance, it turns your IDE into a self-healing, self-scaffolding autonomous agent.

No API keys. No data leaks. Pure local compute.


🚀 The "Vibe Coding" Philosophy

Most AI coding assistants are just chatbots. You paste code, they explain it, you copy-paste it back. That's conversation, not code.

The Vibe Engine is different. It is an Autonomous Execution Interface.

  1. You Prompt: "Refactor this file to use TypeScript."
  2. It Thinks: "I need to read package.json first." -> Reads it.
  3. It Acts: "I need to install typescript." -> Offers Run Button.
  4. It Edits: "I am rewriting server.js now." -> Directly modifies your buffer.

It keeps its own context, manages its own memory, and executes until the job is done.
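The prompt → think → act loop above can be sketched roughly like this. This is an illustrative sketch, not the extension's actual source: `Step`, `agentLoop`, and `callModel` are hypothetical names, and the turn cap is an assumption.

```typescript
// Hypothetical shape of one agent turn: the model either requests a tool
// or declares the job done.
type Step =
  | { kind: "read"; path: string }
  | { kind: "run"; command: string }
  | { kind: "edit"; path: string; patch: string }
  | { kind: "done"; summary: string };

// Minimal agentic loop: keep appending each step's result to the history
// so the model "doesn't forget", and stop when it reports completion.
async function agentLoop(
  prompt: string,
  callModel: (history: string[]) => Promise<Step>
): Promise<string> {
  const history: string[] = [prompt];
  for (let turn = 0; turn < 10; turn++) { // cap turns to avoid runaway loops
    const step = await callModel(history);
    if (step.kind === "done") return step.summary;
    history.push(JSON.stringify(step)); // each tool result feeds the next turn
  }
  return "turn limit reached";
}
```

The key design point is that failures stay in `history`, so a failed command becomes context for the next turn instead of a dead end.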

✨ Features

  • 🔒 100% Local: Works with Ollama (Qwen 2.5, Llama 3, DeepSeek Coder). Your IP, your rules.
  • 🧠 Continuous Agentic Loop: It doesn't forget. If a command fails, it analyzes the error and tries a fix in the next turn.
  • ⚡ Fuzzy Match Editing: No more broken "Search string not found" errors. The engine uses whitespace-insensitive fuzzy matching to locate and patch code blocks surgically.
  • 🛠️ Full Tool Access:
    • <CREATE_FILE>: Scaffolds entire projects (React, Node, Python) from scratch.
    • <edit_file>: Modifies existing files in-place.
    • <RUN_COMMAND>: Suggests terminal commands (npm, git, docker) with one-click execution.
    • <READ_FILE>: Autonomously reads files to gather context.
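The whitespace-insensitive matching mentioned above can be sketched like this. `fuzzyFind` is a hypothetical helper for illustration, not the extension's actual implementation: it normalizes whitespace runs before comparing, so a patch target still matches when indentation differs.

```typescript
// Find the first line of a block in `haystack` that matches `needle`,
// ignoring differences in indentation and whitespace runs.
function fuzzyFind(haystack: string, needle: string): number {
  const norm = (s: string) => s.replace(/\s+/g, " ").trim();
  const target = norm(needle);
  const lines = haystack.split("\n");
  // Grow a window of lines from each start position and compare
  // normalized forms; stop growing once the window is already too long.
  for (let start = 0; start < lines.length; start++) {
    for (let end = start + 1; end <= lines.length; end++) {
      const window = norm(lines.slice(start, end).join("\n"));
      if (window === target) return start; // first line of the match
      if (window.length > target.length) break;
    }
  }
  return -1; // no match
}
```

A plain string search would fail here the moment the editor re-indents a block; normalizing both sides first is what avoids the "search string not found" failure mode.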

🛠️ Setup & Configuration

  1. Install Ollama: Download Ollama and pull a model:

    ollama pull qwen2.5:72b
    # or
    ollama pull deepseek-coder:33b
    
  2. Install the Extension:

    • Download the .vsix release.
    • In VS Code: Extensions -> ... -> Install from VSIX.
  3. Configure:

    • Click the Gear Icon ⚙️ in the IndirectTek chat sidebar.
    • URL: Default http://192.168.86.39:11434 (Change to http://localhost:11434 if running locally).
    • Model: Set to your pulled model (e.g., qwen2.5:72b).
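To confirm the configured URL and model are reachable before chatting, you can query Ollama's `GET /api/tags` endpoint, which lists pulled models. This is a standalone sketch (the `OLLAMA_URL` value, `hasModel`, and `checkOllama` are illustrative, not part of the extension):

```typescript
// Point this at the URL from the settings above.
const OLLAMA_URL = "http://localhost:11434";

type TagList = { models: { name: string }[] };

// True if the tag list contains the configured model, with or without
// an explicit tag suffix (e.g. "qwen2.5" matches "qwen2.5:72b").
function hasModel(tags: TagList, model: string): boolean {
  return tags.models.some(
    (m) => m.name === model || m.name.startsWith(model + ":")
  );
}

// Fetch the pulled-model list from the Ollama server and check it.
async function checkOllama(model: string): Promise<boolean> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) return false;
  return hasModel((await res.json()) as TagList, model);
}
```

If `checkOllama("qwen2.5:72b")` resolves to `false`, either the URL in the settings is wrong or the model hasn't been pulled yet.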

🎮 Usage Examples

The "Zero to Hero" Scaffold:

"Initialize a new Next.js project in this folder. Create a basic homepage that says 'Powered by IndirectTek'."

The Refactor:

"Refactor server.js to use ES6 imports instead of require. Make sure to update package.json to type: module."

The Auto-Fix:

"The build failed with an 'implicitly has an any type' error. Fix it."

📜 License

MIT License. Built with ❤️ and ☕ by IndirectTek.


Verified to run on Apple Silicon & Linux.
