# Memosk - AI Coding Assistant for VS Code

Memosk is your AI-powered coding sidekick that lives in your VS Code Activity Bar. Chat, explain code, fix bugs, and run tests, all with a privacy-first local Ollama fallback.
## 🚀 Quick Start

1. Install dependencies:

   ```sh
   npm install
   ```

2. Build:

   ```sh
   npm run build
   ```

3. Run in dev mode: press `F5` in VS Code.
4. Load in VS Code: package the build (e.g. with `vsce package`), then Extensions > `...` > Install from VSIX.
## 🧠 Recommended Ollama Setup (Free/Local)

Primary: cline + qwen2.5-coder:7b (best agent + model combo). cline is installed as a VS Code extension; only the model is pulled through Ollama:

```sh
ollama pull qwen2.5-coder:7b
```
Alternatives:

- opencode + qwen2.5-coder:7b
- deepseek-coder-v2
- codestral
- qwen2.5-coder:14b (heavier, better quality)

Set `memosk.defaultOllamaModel` in settings.
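For example, the default model can be pinned in `settings.json` (the key comes from this section; the value is whichever model you pulled):

```jsonc
// .vscode/settings.json (or your user settings)
{
  "memosk.defaultOllamaModel": "qwen2.5-coder:7b"
}
```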
## 🔌 Providers

| Provider         | Setup          | Settings              |
| ---------------- | -------------- | --------------------- |
| Ollama (default) | `ollama serve` | `memosk.ollamaHost`   |
| OpenAI           | API key        | `memosk.openaiApiKey` |
| Gemini           | API key        | `memosk.googleApiKey` |
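Putting the table together, a provider setup in `settings.json` might look like this (keys from the table above; the values are placeholders):

```jsonc
{
  "memosk.ollamaHost": "http://localhost:11434",
  // Only needed if you route requests to hosted providers:
  "memosk.openaiApiKey": "<your-openai-key>",
  "memosk.googleApiKey": "<your-gemini-key>"
}
```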
## 📱 Features

- Activity Bar chat (`Memosk: Open Chat`)
- Explain selection/file (via `Ctrl+Shift+P`)
- Improve code (select code, then run the command)
- Terminal/Problems inspection (auto-captures errors)
- Tagging (label files/terminals to add them to context)
- Run tests (`memosk.testCommand`)
- Streaming responses
- Privacy: no uploads unless explicitly enabled and confirmed
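The test runner is driven by `memosk.testCommand`; for a Node project it might be wired like this (the value is an assumption about your project, not a default):

```jsonc
{
  "memosk.testCommand": "npm test"
}
```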
## 🛡️ Privacy

- Local first (Ollama)
- Workspace files are never sent without both:
  - `memosk.privacy.uploadWorkspaceFiles: true`
  - a per-session confirmation dialog
- Terminal outputs sanitized (paths/tokens stripped)
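The sanitization step above could, for instance, redact home-directory paths and token-like strings before any terminal output leaves the machine. A minimal sketch, assuming regex-based scrubbing (the function name and patterns are illustrative, not Memosk's actual implementation):

```python
import re

def sanitize(output: str) -> str:
    """Illustrative scrubber: redact user paths and token-like secrets."""
    # Redact absolute home-directory paths (Unix and Windows styles).
    output = re.sub(r"(/home/|/Users/|C:\\Users\\)[^\s:]+", "<path>", output)
    # Redact long alphanumeric runs that resemble API keys or tokens.
    output = re.sub(r"\b[A-Za-z0-9_\-]{32,}\b", "<token>", output)
    return output

print(sanitize("Error in /home/alice/project/app.js"))  # → Error in <path>
```

A real implementation would likely need more patterns (URLs with credentials, env-var dumps), but the shape is the same: sanitize first, send second.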
## 🧪 Test Checklist

- [ ] Activity Bar shows the Memosk icon
- [ ] Chat view opens
- [ ] `memosk.ask` quick input works
- [ ] Explain/improve commands work with a selection
- [ ] `npm run build` succeeds
- [ ] `F5` loads the extension without errors
- [ ] Ollama responds (with a model pulled)
## Troubleshooting

- Ollama not responding: run `ollama serve` and check http://localhost:11434
- No model: `ollama pull qwen2.5-coder:7b`
- CORS: ensure Ollama allows requests from the extension (e.g. via the `OLLAMA_ORIGINS` environment variable)
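To check which models a running Ollama instance has installed, its `GET /api/tags` endpoint returns a JSON list of local models. A sketch of parsing that response body (the sample payload is illustrative):

```python
import json

def installed_models(tags_json: str) -> list[str]:
    """Extract model names from the body of Ollama's GET /api/tags response."""
    return [m["name"] for m in json.loads(tags_json).get("models", [])]

# Illustrative response body, e.g. from: curl http://localhost:11434/api/tags
sample = '{"models": [{"name": "qwen2.5-coder:7b", "size": 4683087332}]}'
print(installed_models(sample))  # → ['qwen2.5-coder:7b']
```

If the expected model is missing from the list, `ollama pull` it; if the request itself fails, the server is not running or the host setting is wrong.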
## Recommended Combos (Settings → Model Routing)

- Daily: cline + qwen2.5-coder:7b
- Heavy: gpt-4o-mini (OpenAI)
- Code-only: deepseek-coder-v2
Happy coding! 🚀