# Explain-o-matic 🤖🔊
Voice-guided code walkthroughs, great for LLM-generated code. Quickly grok what the code is about with high-level summaries, then break down sections for more detail.


## Features

### 🎙️ AI-Powered Code Breakdown & Voice Explanations

Automatically splits code changes into logical sections and speaks summaries using your OS's native TTS (no API needed).

### Smart Navigation

Jump between sections with the status bar controls and the section navigator; break sections down further by right-clicking.

### 🧠 Reasoning Support

Supports reasoning models for smarter summaries.
## ⚠️ Experimental Warning

This extension is experimental, and not all features are available.

- ✅ Tested on macOS Sequoia
- ✅ Tested on Windows 11
- ⚠️ TTS not tested on Linux
## Install

### Marketplace

Download from the Marketplace.

### From VSIX

```shell
code --install-extension explain-o-matic-1.0.0.vsix
```
## Usage

- Run `Explain-o-matic: Start Review` from the command palette.
- Use these controls:
  - `Next Section`: status bar → (or `Explain-o-matic: Next Section` from the command palette)
  - `Stop Review`: status bar ⬛ (or `Explain-o-matic: Stop Review` from the command palette)
- Click sections in the side panel to jump to them.
- Right-click sections in the side panel to break them down further (or run `Explain-o-matic: Breakdown Section` from the command palette).
## Configuration

Add to your `settings.json`:

```json
"explainomatic.llm": {
    "reasoner": {
        "provider": "deepseek",
        "model": "deepseek-reasoning",
        "apiKey": "sk-your-key-here (or we read from ENV)"
    },
    "sectioner": {
        "provider": "anthropic",
        "model": "claude-3-5-sonnet-20240620",
        "apiKey": "sk-your-key-here (or we read from ENV)",
        "temperature": 0.1
    }
}
```
If the reasoner is enabled, it analyzes the code first and passes its output to the sectioner. It is intended only for models that expose their reasoning process, and it helps produce better breakdowns. The sectioner then splits the code into sections.
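If your model doesn't expose its reasoning, you can skip the reasoner stage entirely. A sectioner-only setup might look like this (a sketch based on the options in this README; the model name is illustrative, so substitute one your provider offers):

```json
"explainomatic.useReasoner": false,
"explainomatic.llm": {
    "sectioner": {
        "provider": "openai",
        "model": "gpt-4o",
        "apiKey": "sk-your-key-here (or we read from ENV)",
        "temperature": 0.1
    }
}
```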
## Supported LLM Providers

- [x] DeepSeek `deepseek`
- [x] OpenAI `openai`
- [ ] Google Vertex `vertex`
- [x] Anthropic `anthropic`
- [x] XAI `xai`
- [x] Groq `groq`
- [ ] OpenAI Compatible `openai-compatible`
Feel free to add more providers. We're just wrapping the Vercel AI SDK.
## Troubleshooting and more options

### No Speech?

### No API Key?

- Set `explainomatic.useEnvKeys` to `true` to read keys from your environment variables as well as from `settings.json`.
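Since the extension wraps the Vercel AI SDK, the environment variable names it reads likely follow the SDK's provider conventions. This is an assumption; check your provider's documentation. A sketch:

```shell
# Assumed variable names: Vercel AI SDK providers conventionally read keys
# such as ANTHROPIC_API_KEY and DEEPSEEK_API_KEY from the environment.
export ANTHROPIC_API_KEY="sk-your-key-here"
export DEEPSEEK_API_KEY="sk-your-key-here"

# Confirm the variable is set (prints "set" if present).
echo "${ANTHROPIC_API_KEY:+set}"
```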
### Large Files?

Add a warning threshold to settings with `explainomatic.fileSizeWarning`.

### Other Options

- `explainomatic.fileSizeWarning: 500` to modify the large-file-size warning threshold
- `explainomatic.useReasoner: true/false` to enable/disable the reasoner
- `explainomatic.showStatusBarButtons: true/false` to show/hide the status bar buttons
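Putting these options together, a `settings.json` fragment might look like this (a sketch; the values are illustrative defaults, not requirements):

```json
"explainomatic.useEnvKeys": true,
"explainomatic.useReasoner": true,
"explainomatic.showStatusBarButtons": true,
"explainomatic.fileSizeWarning": 500
```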
## Future plans

- [ ] Add more LLM providers
- [ ] Support for local LLMs
- [ ] GUI for configuration
- [ ] Support for GitHub Copilot API

### Maybe

- [ ] Support for additional context from imported files
- [ ] Other voice options (OpenAI)
## Author

@weloveoov
## 📜 License
MIT © 2025