Choose your provider and enter your API key (or select Ollama for free local use)
Select any code and an explanation appears automatically!
Settings
| Setting | Description | Default |
|---|---|---|
| `vibeco.provider` | LLM provider | Auto-detect |
| `vibeco.apiKey` | API key for your provider | - |
| `vibeco.model` | Model override | Provider default |
| `vibeco.language` | Explanation language (`en`/`tr`) | `en` |
| `vibeco.explainOnFileOpen` | Explain the file's role on open | `true` |
| `vibeco.debounceMs` | Delay (ms) before explaining | `800` |
| `vibeco.ollamaUrl` | Ollama server URL | `http://localhost:11434` |
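
As a sketch, these settings map to entries in your VS Code `settings.json`. The values below are illustrative only (for example, the `llama3` model name is an assumption, not a tested recommendation):

```jsonc
{
  // Illustrative local setup: point VibeCode at an Ollama server
  "vibeco.provider": "ollama",
  "vibeco.ollamaUrl": "http://localhost:11434",
  "vibeco.model": "llama3",          // hypothetical model override
  "vibeco.language": "en",
  "vibeco.explainOnFileOpen": true,
  "vibeco.debounceMs": 800           // wait 800 ms after selection before explaining
}
```

VS Code's `settings.json` accepts JSONC, so the comments above are valid there.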
Why VibeCode?
If you use AI agents to write code, you know the struggle: the agent changes files, but you don't fully understand what changed or why. VibeCode bridges that gap by explaining every piece of code in simple terms, helping you learn as you build.