vide-code

AI assistant for VS Code by vide-code
Installation
Launch VS Code Quick Open (Ctrl+P), paste the extension's install command, and press Enter.

vide-code

vide-code is an AI engineering workspace inside VS Code: you can ask, plan, execute, review, and automate in a single flow.

It is not just "chat for code". It is a product built to turn real tasks into real delivery with quality, traceability, and speed.

Why use it

  • Faster delivery: from quick bug fixes to full features with clear progress.
  • Less context switching: chat, terminal, review, and automation in one place.
  • Quality-first workflow: AI-assisted code review, validations, and outcome-driven iteration.
  • Scalable execution: combine modes, agents, and external tools for complex tasks.

Modes that match your moment

  • Ask: understand code, unblock decisions, and move fast on technical questions.
  • Agent: execute real changes across files, commands, and project workflows.
  • Planner: break larger goals into clear, actionable steps before implementation.
  • DAG: orchestrate dependent and parallel steps as a directed acyclic graph for predictable execution flows.
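
The DAG idea — run steps in dependency order, batching independent steps so they can run in parallel — can be sketched with Python's standard-library `graphlib`. The step graph below is a made-up example, not vide-code's internal representation:

```python
from graphlib import TopologicalSorter

# Hypothetical step graph: each step maps to the steps it depends on.
steps = {
    "plan": set(),
    "implement": {"plan"},
    "write_tests": {"plan"},
    "review": {"implement", "write_tests"},
}

def run_dag(steps):
    """Return execution batches: steps in the same batch have no
    dependencies on each other and could run in parallel."""
    ts = TopologicalSorter(steps)
    ts.prepare()
    batches = []
    while ts.is_active():
        ready = sorted(ts.get_ready())  # all steps whose deps are done
        batches.append(ready)
        ts.done(*ready)
    return batches

print(run_dag(steps))
# [['plan'], ['implement', 'write_tests'], ['review']]
```

Here "implement" and "write_tests" land in the same batch because neither depends on the other, which is exactly the parallelism a DAG mode can exploit.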

Core product capabilities

  • Multi-tab chat: keep isolated contexts per stream (feature, hotfix, refactor, review).
  • vide-code CLI: the same chat experience in the terminal, with an advanced mode tuned for high-speed execution.
  • AI-assisted Code Review: surface risks, regressions, and maintainability improvements.
  • MCP: connect tools, APIs, and external services to expand what the assistant can do.
  • Multiagent: split complex work across specialized agents to increase throughput.
  • Custom Agents: define agents with custom instructions, style, and scope.
  • Skills: plug in reusable capabilities to standardize recurring team workflows.

AI models and providers

vide-code is provider-flexible so you can optimize for quality, speed, privacy, or cost by task:

  • OpenAI: strong quality for coding, planning, review, and automation.
  • Gemini: robust option for broad-context analysis and engineering work.
  • Grok: fast alternative for exploration and technical iteration.
  • Ollama: run local models for stronger privacy and infrastructure control.
  • Bedrock: access models through AWS with enterprise-ready ecosystem integration.
  • OpenAI Compatibility: connect providers that support OpenAI-compatible APIs.
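
One reason OpenAI compatibility is useful: every provider that speaks the OpenAI chat-completions API accepts the same request shape, so switching providers is mostly a matter of changing the base URL and model name. A rough sketch of that shape (the URL and model below are illustrative placeholders, e.g. a local Ollama server):

```python
import json

# Assumption for illustration: a local Ollama server exposing the
# OpenAI-compatible endpoint. Any compatible provider works the same way.
BASE_URL = "http://localhost:11434/v1"
ENDPOINT = BASE_URL + "/chat/completions"

payload = {
    "model": "llama3",  # placeholder: whatever model your provider serves
    "messages": [
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Explain this stack trace."},
    ],
}

# The JSON body you would POST to ENDPOINT with an Authorization header.
body = json.dumps(payload)
```

Because the body is identical across compatible providers, optimizing for quality, speed, privacy, or cost per task reduces to swapping `BASE_URL` and `model`.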

Beyond chat models, you can also configure embedding models compatible with OpenAIEmbeddings, enabling semantic search, RAG workflows, and richer project context retrieval.
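
The retrieval idea behind those embedding-powered features is simple: embed documents and queries into vectors, then rank documents by cosine similarity to the query. A toy, dependency-free sketch (the bag-of-words "embedding" below stands in for a real OpenAIEmbeddings-compatible model; the documents are made up):

```python
import math
from collections import Counter

def embed(text):
    """Toy stand-in for a real embedding model: a bag-of-words vector.
    In practice an OpenAIEmbeddings-compatible model returns dense vectors."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "function that parses JSON config files",
    "React component for the login form",
    "database migration adding a users table",
]
index = [(d, embed(d)) for d in docs]

def search(query, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(index, key=lambda de: cosine(q, de[1]), reverse=True)
    return [d for d, _ in ranked[:k]]

print(search("parse config"))
```

A real embedding model captures meaning rather than exact word overlap, which is what makes semantic search and RAG over a project's code effective.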

Recommended quick start

  1. Install the extension in VS Code.
  2. Configure your provider (OpenAI, Gemini, Grok, Ollama, Bedrock, or OpenAI Compatibility).
  3. Open two chat tabs: one for implementation and one for review.
  4. Use Planner to structure, Agent to execute, and Code Review to validate.

vide-code: less friction to decide, more capacity to build.
