# JarNox Command Copilot (Demo)
A simple Copilot-style sidebar for VS Code. Type a prompt and click "Apply" to generate code and insert it at your cursor(s) in the active editor.
This demo currently calls an Ollama-compatible text-generation endpoint and inserts the returned code. It can also execute a small set of file actions if the model replies with a specific JSON payload (see Actions).
## Quick Start

```bash
npm install
npm run compile
```

In VS Code, press `F5` to launch the Extension Development Host. In the host window:
- Open or create any file.
- In the Explorer view, open the sidebar panel "JarNox Command Copilot (Demo)".
- Type a prompt (e.g., `create a function to add two numbers`) and click Apply.
You’ll see a brief "Generating code..." notification. The result is inserted at all active carets.
## How It Works
- Sidebar UI posts your prompt to the extension.
- The extension calls an Ollama-compatible HTTP endpoint with model `llama3.2:latest`.
- The response is unwrapped (code fences/comments removed) and inserted into the editor.
- If multiple carets are active, the text is inserted at each caret.
Endpoint and model are currently hard-coded in `src/extension.ts` under `callOllama` (sketched below). Default endpoint: `http://72.60.98.171:11434/api/generate`.
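A minimal sketch of that flow, assuming the global `fetch` available in recent VS Code extension hosts; apart from `callOllama`, the helper names and the exact cleanup rules are illustrative rather than the extension's actual code:

```ts
import * as vscode from "vscode";

const OLLAMA_URL = "http://72.60.98.171:11434/api/generate";
const MODEL = "llama3.2:latest";

// Calls the Ollama-compatible /api/generate endpoint and returns the raw completion.
async function callOllama(prompt: string): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: MODEL, prompt, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }
  const data = (await res.json()) as { response?: string };
  return data.response ?? "";
}

// Strips surrounding Markdown code fences so only the code itself is inserted.
function unwrap(text: string): string {
  return text.replace(/^```[a-zA-Z]*\n?/m, "").replace(/\n?```\s*$/m, "").trim();
}

// Inserts the generated text at every active caret in the current editor.
async function insertAtCarets(text: string): Promise<void> {
  const editor = vscode.window.activeTextEditor;
  if (!editor) {
    vscode.window.showWarningMessage("Open a file to insert the generated code.");
    return;
  }
  await editor.edit((builder) => {
    for (const selection of editor.selections) {
      builder.insert(selection.active, text);
    }
  });
}

// Wiring it together when the sidebar posts a prompt:
async function handlePrompt(prompt: string): Promise<void> {
  const raw = await callOllama(prompt);
  await insertAtCarets(unwrap(raw));
}
```

Setting `stream: false` makes Ollama return the whole completion as a single JSON object with a `response` field, which keeps the handler simple.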
## Actions (Experimental)
If the model responds with a JSON object like the following, the extension will attempt to execute it:
{ "action": "create_file", "path": "notes/todo.txt", "content": "Hello" }
Supported actions:
- `create_file`: creates a new file with `content` inside the open workspace
- `append_file`: appends `content` to an existing file
- `insert_code`: inserts `content` at the current caret(s)
Notes:
- File actions require an open workspace; otherwise they are skipped with a warning.
- Paths are resolved relative to the first workspace folder.
- Content is cleaned of surrounding comments and code fences before writing (see the sketch below).
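A rough sketch of how such a payload might be dispatched, reusing the `unwrap` and `insertAtCarets` helpers from the previous sketch; the `runAction` name and the details here are illustrative, not the extension's exact code:

```ts
import * as vscode from "vscode";

interface ModelAction {
  action: "create_file" | "append_file" | "insert_code";
  path?: string;
  content?: string;
}

async function runAction(reply: string): Promise<void> {
  let parsed: ModelAction;
  try {
    parsed = JSON.parse(reply);
  } catch {
    return; // Not an action payload; treat the reply as plain code instead.
  }

  const content = unwrap(parsed.content ?? ""); // same fence/comment cleanup as above
  const folder = vscode.workspace.workspaceFolders?.[0];

  switch (parsed.action) {
    case "create_file":
    case "append_file": {
      if (!folder || !parsed.path) {
        vscode.window.showWarningMessage("Open a workspace to use file actions.");
        return;
      }
      // Paths are resolved relative to the first workspace folder.
      const uri = vscode.Uri.joinPath(folder.uri, parsed.path);
      // Ensure the parent folder exists (ignore the error if it already does).
      try {
        await vscode.workspace.fs.createDirectory(vscode.Uri.joinPath(uri, ".."));
      } catch { /* ignore */ }
      let existing = "";
      if (parsed.action === "append_file") {
        // Assumes the target file already exists, per the notes above.
        existing = Buffer.from(await vscode.workspace.fs.readFile(uri)).toString("utf8");
      }
      await vscode.workspace.fs.writeFile(uri, Buffer.from(existing + content, "utf8"));
      break;
    }
    case "insert_code":
      await insertAtCarets(content);
      break;
  }
}
```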
## Usage Tips
- Multi-cursor: place multiple carets to insert the generated text in several locations.
- Empty prompt: you’ll be asked to enter a prompt first.
- No active editor: you’ll be prompted to open a file to insert code.
## Packaging (optional)

```bash
npm i -g @vscode/vsce
vsce package
```
Install the produced `.vsix` via VS Code → Extensions → ⋯ → Install from VSIX.
## Troubleshooting
- Generation failed: the Ollama endpoint may be unreachable; verify network access.
- No workspace open: file actions like `create_file`/`append_file` are disabled.
- Nothing inserted: ensure a file is open and your prompt is non-empty.
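For the first item, one way to check reachability from inside the extension host is a quick request to the server root (a hypothetical helper; any HTTP client works just as well):

```ts
import * as vscode from "vscode";

// Hypothetical connectivity check for the hard-coded Ollama endpoint.
// A running Ollama server answers GET / with "Ollama is running".
async function checkOllamaReachable(): Promise<void> {
  try {
    const res = await fetch("http://72.60.98.171:11434/");
    vscode.window.showInformationMessage(`Ollama endpoint reachable (HTTP ${res.status}).`);
  } catch (err) {
    vscode.window.showErrorMessage(`Ollama endpoint unreachable: ${String(err)}`);
  }
}
```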
## Roadmap / Notes
- Endpoint and model selection via settings.
- Better status messages and logs.
- Safer handling and confirmation for file actions.