Llama VS Code Agent
Local VS Code extension that uses Ollama as a coding agent.
Features
- Ask the model questions about the current project
- Read and search files
- Write files with confirmation
- Run terminal commands with confirmation
Requirements
- VS Code
- Ollama running locally
- A local model pulled, for example:
ollama pull llama3.1:8b
Setup
npm install
npm run build
Press F5 in VS Code to open the Extension Development Host.
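For the Extension Development Host to pick the extension up, the manifest needs a command contribution and a bundled entry point. A minimal package.json sketch, assuming the command id llamaAgent.ask and the dist/ output path (both illustrative, not fixed by this project):

```json
{
  "name": "llama-vscode-agent",
  "engines": { "vscode": "^1.85.0" },
  "main": "./dist/extension.js",
  "contributes": {
    "commands": [
      { "command": "llamaAgent.ask", "title": "Llama Agent: Ask" }
    ]
  },
  "scripts": {
    "build": "esbuild src/extension.ts --bundle --outfile=dist/extension.js --external:vscode --format=cjs --platform=node"
  }
}
```

The --external:vscode flag matters: the vscode module is provided by the editor at runtime and must not be bundled.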
Suggested prompts
- Find possible re-render issues in the current file
- Search where this hook is used in the workspace
- Run tests and explain the main failure
- Refactor this component with minimal changes
Safety
- File writes require confirmation
- Terminal commands require confirmation
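Both confirmations can flow through one approval point. A minimal sketch: in the extension, confirm would wrap vscode.window.showWarningMessage(..., { modal: true }, "Yes"); here it is injected as a callback (the names guarded and Confirm are illustrative) so the gating logic stands on its own.

```typescript
// Hypothetical confirmation gate for file writes and terminal commands.
// `confirm` is injected; the real extension would back it with a modal
// vscode.window.showWarningMessage prompt.
type Confirm = (message: string) => Promise<boolean>;

export async function guarded<T>(
  description: string,
  confirm: Confirm,
  action: () => Promise<T>
): Promise<T | undefined> {
  // Every destructive action passes through this single approval point.
  const approved = await confirm(`Allow the agent to ${description}?`);
  if (!approved) return undefined;
  return action();
}
```

With this shape, tests can pass a fake confirm that always answers yes or no, so the gating logic is checkable without VS Code running.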
18) First improvements after MVP
- Add a sidebar chat with webview.
- Show diffs before applying writes.
- Limit writable paths to the workspace.
- Add read_many_files and list_files tools.
- Add streaming responses.
- Add patch-based edits instead of full file overwrite.
- Add ripgrep integration for faster search.
- Add ignore rules like .gitignore and custom allowlists.
19) Good first commit sequence
git init
npm init -y
npm install -D typescript esbuild @types/node @types/vscode vsce
Then create the files above and run:
npm run build
If Ollama is not running, start it and test:
ollama run llama3.1:8b
20) MVP scope recommendation
For your first version, keep only this:
- command palette integration
- ask prompt
- read file
- search workspace
- write file with confirm
- run command with confirm
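The six MVP pieces stay manageable if every tool sits behind one dispatch function the agent loop calls by name. A sketch under stated assumptions: the tool names and the string-to-string argument shape are illustrative, not a fixed Ollama schema.

```typescript
// Hypothetical tool registry for the MVP tool set. Each tool takes named
// string arguments and returns text that is fed back to the model.
type ToolHandler = (args: Record<string, string>) => Promise<string>;

const tools = new Map<string, ToolHandler>();

export function registerTool(name: string, handler: ToolHandler): void {
  tools.set(name, handler);
}

export async function dispatch(
  name: string,
  args: Record<string, string>
): Promise<string> {
  const handler = tools.get(name);
  // Return the error as text so the model can recover instead of crashing.
  if (!handler) return `Unknown tool: ${name}`;
  return handler(args);
}
```

Adding a tool later (read_many_files, list_files) is then one registerTool call, with the agent loop unchanged.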
Do not start with:
- autonomous edits in many files
- git automation
- background loops
- webview UI
- embeddings/RAG
Make it work small first.