# ConvoCode Chat
A Visual Studio Code extension that lets you chat with Ollama models directly from your editor.
## Features
- Send messages to any local or remote Ollama model
- Live streaming of responses with partial updates (see the sketch after this list)
- Markdown rendering in chat history
- Select from one or more configured models
- Easy build, debug, and package workflow
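
Streaming relies on Ollama's `/api/chat` endpoint, which emits newline-delimited JSON chunks while `stream` is enabled. Below is a minimal sketch of consuming that stream; the function name, error handling, and `fetch`-based transport are assumptions, not necessarily how this extension implements it:

```typescript
// Illustrative only: streams a single chat turn from an Ollama server.
async function streamChat(baseUrl: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: true, // Ollama responds with newline-delimited JSON chunks
    }),
  });
  if (!res.ok || !res.body) throw new Error(`Ollama request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let full = "";
  let buffered = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    // Each complete line is one JSON chunk: { message: { content }, done }
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) full += chunk.message.content; // partial update
    }
  }
  return full;
}
```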
## Prerequisites

- Visual Studio Code
- Node.js and npm
- A local or remote Ollama instance (default endpoint: `http://127.0.0.1:11434`)

## Installation
Clone the repo and install dependencies:
```bash
git clone https://github.com/yourname/convocode.git
cd convocode
npm install
```
## Development
Compile TypeScript and watch for changes:
```bash
npm run watch
```
Launch the Extension Development Host:
- Open this folder in VS Code
- Press `F5` to launch a new window with ConvoCode loaded
In the Extension Development Host, open the Command Palette (`Ctrl+Shift+P`) and run:

```
ConvoCode Chat: Start Conversation
```
Type your messages into the input box and hit Send.
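
Under the hood, that palette entry maps to a contributed command registered during activation. The sketch below shows what such a registration could look like in `extension.ts`; the command id `convocode.startConversation` and the webview details are assumptions, not the extension's actual code:

```typescript
import * as vscode from "vscode";

export function activate(context: vscode.ExtensionContext) {
  // Hypothetical command id; the palette title "ConvoCode Chat: Start Conversation"
  // would be declared against it in package.json under "contributes.commands".
  const disposable = vscode.commands.registerCommand("convocode.startConversation", () => {
    const panel = vscode.window.createWebviewPanel(
      "convocodeChat",          // internal view type
      "ConvoCode Chat",         // panel title
      vscode.ViewColumn.Beside, // open next to the active editor
      { enableScripts: true }   // the chat UI needs script execution
    );
    panel.webview.html = "<html><body><!-- chat UI goes here --></body></html>";
  });
  context.subscriptions.push(disposable);
}
```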
## Configuration
Settings can be modified in your VS Code `settings.json` under the `convocode` namespace:

```jsonc
{
  "convocode.baseUrl": "http://127.0.0.1:11434",          // Ollama API endpoint
  "convocode.model": "qwen3:0.6b",                        // Default model
  "convocode.models": ["qwen3:0.6b", "other-model:tag"]   // Available models
}
```
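
Inside an extension, these values can be read with the standard VS Code configuration API. A minimal sketch, using only the defaults shown above (the helper name is illustrative):

```typescript
import * as vscode from "vscode";

// Reads the convocode settings shown above, falling back to the documented defaults.
function getConvocodeConfig() {
  const cfg = vscode.workspace.getConfiguration("convocode");
  return {
    baseUrl: cfg.get<string>("baseUrl", "http://127.0.0.1:11434"),
    model: cfg.get<string>("model", "qwen3:0.6b"),
    models: cfg.get<string[]>("models", ["qwen3:0.6b"]),
  };
}
```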
## Packaging & Publishing
Compile the extension:

```bash
npm run compile
```

Create a VSIX package:

```bash
npx vsce package
```

Install the generated `.vsix` locally:

```bash
code --install-extension convocode-0.0.1.vsix
```

(Optional) Publish to the VS Code Marketplace:

```bash
npx vsce publish
```
## Release Notes

### 0.0.1
- Initial release with streaming chat, model selection, and markdown support.
Contributions, issues, and feature requests are welcome!
Developed by Arun Kumar Tiwary