# JarNox Command Copilot (Demo)
JarNox Command Copilot is a VS Code sidebar extension that lets you describe the code you want and then inserts the generated result directly into the active editor, at every active cursor. The project demonstrates how a Copilot-style workflow can be layered on top of an Ollama-compatible language model endpoint.
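
The insertion path relies on the standard VS Code editing API. As a rough illustration (not the extension's exact code), applying the same generated text at every cursor can be done like this:

```ts
import * as vscode from 'vscode';

// Illustrative sketch only: insert the same generated text at every active cursor.
// The real extension may differ; this just shows the VS Code API involved.
async function insertAtAllCursors(text: string): Promise<void> {
  const editor = vscode.window.activeTextEditor;
  if (!editor) {
    vscode.window.showWarningMessage('Open an editor before applying a prompt.');
    return;
  }
  await editor.edit((edit) => {
    for (const selection of editor.selections) {
      edit.insert(selection.active, text);
    }
  });
}
```
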
## Requirements
- VS Code 1.84 or newer
- Node.js 18+ (for building and testing)
- An accessible Ollama-compatible `POST /api/generate` endpoint. By default the extension targets `http://72.60.98.171:11434/api/generate`, but you can point it at your own host (a quick reachability check is sketched below).
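
If you are unsure whether your endpoint is reachable, a quick standalone check using Node 18's built-in `fetch` might look like the sketch below (the URL and model name are just the defaults mentioned above; adjust them for your host):

```ts
// Standalone reachability check for an Ollama-compatible /api/generate endpoint.
// This is a sketch, not part of the extension; run it with a recent Node or ts-node.
const endpoint = 'http://72.60.98.171:11434/api/generate';

async function main(): Promise<void> {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'llama3.2:latest', prompt: 'ping', stream: false }),
  });
  console.log(res.ok ? 'Endpoint reachable' : `Endpoint returned HTTP ${res.status}`);
}

main().catch((err) => console.error('Endpoint unreachable:', err));
```
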
## Install & Build

```bash
npm install
npm run compile
```

Tip: `npm run compile -- --watch` keeps the TypeScript output up to date while you work.
## Run the Extension
- Open the `co-pilot_jarnox` folder in VS Code (`code .` from the project root).
- Press `F5`, or start the Run Extension configuration from the Run and Debug view.
- A new VS Code window titled Extension Development Host opens; use this window to interact with the extension.
  If the host window does not appear, make sure you opened the project root so VS Code can find `.vscode/launch.json`.
- In the host window, open or create a file in the editor.
- Open the JarNox Command Copilot (Demo) panel from the Explorer view (or run the command `JarNox: Show Command Copilot`).
- Type a prompt such as “write a TypeScript function that adds two numbers” and click Apply.
- The extension calls the configured model, removes code fences and comments, and inserts the result at every active cursor (a sketch of this cleaning step follows this list).
  While the request runs you will see a “Generating code…” notification. Multiple cursors receive the same response.
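
The cleaning step mentioned above is plain string handling. A minimal sketch of stripping Markdown code fences and leading comment-only lines (the actual helper in the extension may differ) could look like:

```ts
// Sketch: strip surrounding Markdown code fences and leading comment-only lines
// from a model response before inserting it. Not the extension's exact code.
function cleanModelOutput(raw: string): string {
  let text = raw.trim();
  // Remove an opening fence such as ``` or ```ts, and a trailing closing fence.
  text = text.replace(/^```[\w-]*\s*\n/, '').replace(/\n```\s*$/, '');
  // Drop leading lines that contain only a // or # comment.
  const lines = text.split('\n');
  while (lines.length > 0 && /^\s*(\/\/|#)/.test(lines[0])) {
    lines.shift();
  }
  return lines.join('\n').trim();
}
```
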
## Model Endpoint & Configuration

The endpoint URL and model name are currently hard-coded in `src/extension.ts` inside the `callOllama` helper. Update the `endpoint` and `model` fields there to target your own service, then run `npm run compile` again.
```ts
const payload = {
  model: 'llama3.2:latest',
  prompt,
  stream: false,
};
```
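
For reference, a non-streaming Ollama `/api/generate` call returns a single JSON object whose `response` field holds the generated text. A minimal sketch of what a helper like `callOllama` might do (assuming a plain `fetch` call; the actual code in `src/extension.ts` may differ) is:

```ts
// Sketch of a non-streaming Ollama request; not the extension's exact implementation.
async function callOllama(prompt: string): Promise<string> {
  const endpoint = 'http://72.60.98.171:11434/api/generate';
  const payload = { model: 'llama3.2:latest', prompt, stream: false };
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed with HTTP ${res.status}`);
  }
  // With stream: false the body is one JSON object; "response" holds the generated text.
  const data = (await res.json()) as { response?: string };
  return data.response ?? '';
}
```
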
## File Actions (Experimental)
The extension can execute simple file operations when the model responds with a JSON “action” object; a minimal dispatch sketch appears after the safeguards below.
Supported actions:
- `create_file` – create a new file with the provided content
- `append_file` – append content to an existing file
- `insert_code` – insert content at the current cursor positions
Example model response:
```json
{ "action": "create_file", "path": "docs/todo.md", "content": "# TODO\n- item" }
```
Safeguards:
- Actions only run when a workspace is open; otherwise they are skipped with a warning.
- Paths are resolved relative to the first workspace folder.
- Content is cleaned of surrounding comments and code fences before writing.
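
Below is a minimal sketch of how such an action object could be dispatched while honoring the safeguards above. It is illustrative only; the extension's actual handler may differ.

```ts
import * as vscode from 'vscode';
import { Buffer } from 'node:buffer';

// Shape of the experimental action object (sketch).
interface FileAction {
  action: 'create_file' | 'append_file' | 'insert_code';
  path?: string;
  content?: string;
}

// Illustrative dispatcher; content is assumed to be already cleaned of fences and comments.
async function handleAction(action: FileAction): Promise<void> {
  const folder = vscode.workspace.workspaceFolders?.[0];
  if (!folder) {
    vscode.window.showWarningMessage('File actions are skipped: no workspace folder is open.');
    return;
  }
  const content = action.content ?? '';

  if (action.action === 'insert_code') {
    const editor = vscode.window.activeTextEditor;
    if (!editor) {
      return;
    }
    await editor.edit((edit) => {
      for (const selection of editor.selections) {
        edit.insert(selection.active, content);
      }
    });
    return;
  }

  if (!action.path) {
    vscode.window.showWarningMessage('File action is missing a "path" field.');
    return;
  }
  // Paths resolve relative to the first workspace folder.
  const uri = vscode.Uri.joinPath(folder.uri, action.path);
  const addition = Buffer.from(content, 'utf8');
  if (action.action === 'create_file') {
    await vscode.workspace.fs.writeFile(uri, addition);
  } else {
    // append_file: read the existing file and write it back with the new content appended.
    const existing = await vscode.workspace.fs.readFile(uri);
    await vscode.workspace.fs.writeFile(uri, Buffer.concat([existing, addition]));
  }
}
```
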
## Troubleshooting
- No host window after F5 – reopen VS Code from the project root so the debug configuration is detected.
- Generation failed – verify the Ollama endpoint is reachable; check the Output panel (JarNox Command Copilot channel) for error details.
- Nothing inserted – ensure an editor is active and the prompt is not empty.
- Action ignored – confirm the JSON response uses one of the supported action names and that a workspace folder is open.
## Packaging

To produce a `.vsix` for manual installation:

```bash
npm install -g @vscode/vsce
vsce package
```
Install the resulting file via Extensions → … → Install from VSIX….
## What’s Next

- Make the endpoint and model configurable through VS Code settings (a rough sketch of reading such settings follows this list).
- Improve status messaging and error reporting inside the sidebar.
- Add confirmation dialogs before performing file actions.
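
As an illustration of the first item, the hard-coded values could be read through VS Code's configuration API instead. The setting keys below are hypothetical and would also need matching `contributes.configuration` entries in `package.json`:

```ts
import * as vscode from 'vscode';

// Sketch only: read the endpoint and model from user settings instead of hard-coding them.
// "jarnoxCopilot.endpoint" and "jarnoxCopilot.model" are hypothetical setting keys.
function getModelConfig(): { endpoint: string; model: string } {
  const config = vscode.workspace.getConfiguration('jarnoxCopilot');
  return {
    endpoint: config.get<string>('endpoint', 'http://72.60.98.171:11434/api/generate'),
    model: config.get<string>('model', 'llama3.2:latest'),
  };
}
```
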