LlamaVerse 🪄✨: VSCode Edition
Enter the LlamaVerse and chat with Ollama models directly in your VS Code editor!
Features
Connect to your local Ollama instance
Choose from available Ollama models
Chat with AI models in a dedicated VS Code webview (see the API sketch after this list)
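Under the hood, the extension needs to discover which models your local Ollama instance provides. The sketch below shows one way that could be done, assuming Ollama's default REST endpoint (GET /api/tags on http://localhost:11434); the constant and function names are illustrative and not taken from the extension's source.

```typescript
// Illustrative sketch: list locally available Ollama models via the default
// /api/tags endpoint. OLLAMA_URL and listModels are hypothetical names.
const OLLAMA_URL = "http://localhost:11434";

interface OllamaTag {
  name: string; // e.g. "llama3:latest"
}

async function listModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama returned HTTP ${res.status}`);
  }
  const body = (await res.json()) as { models: OllamaTag[] };
  return body.models.map((m) => m.name);
}
```

The returned names are what populate the model dropdown described under Usage below.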
Requirements
VS Code 1.60.0 or higher
Ollama installed and running on your local machine
Installation
Install the extension from the VS Code Marketplace
Ensure Ollama is installed and running on your machine
Run the following command in a terminal to start the Ollama server (it listens on http://localhost:11434 by default):
ollama serve
Usage
Open the Command Palette (Ctrl+Shift+P, or Cmd+Shift+P on macOS)
Type "Start LlamaVerse Chat" and select the command
Choose your desired Ollama model from the dropdown
Start chatting! (A sketch of the underlying chat request is shown below.)
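Once a model is selected, each chat turn boils down to a request against Ollama's chat endpoint. The following is a minimal, non-streaming sketch assuming the default POST /api/chat endpoint on http://localhost:11434; the actual extension may stream responses and its internals may differ.

```typescript
// Illustrative sketch: send a single non-streaming chat request to Ollama.
const OLLAMA_URL = "http://localhost:11434";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function chat(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Ollama returned HTTP ${res.status}`);
  }
  const body = (await res.json()) as { message: ChatMessage };
  return body.message.content;
}

// Example: chat("llama3", [{ role: "user", content: "Hello!" }]).then(console.log);
```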
Known Issues
The extension currently only works with Ollama running on localhost
Limited error handling for network issues
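Because error handling is still limited, a simple reachability check before starting a chat can save confusion. This is a sketch only, assuming the default localhost:11434 endpoint; checkOllamaReachable and the message text are hypothetical and not part of the extension's API.

```typescript
// Illustrative sketch: verify that the local Ollama server is reachable and
// surface a readable error in VS Code if it is not.
import * as vscode from "vscode";

async function checkOllamaReachable(): Promise<boolean> {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    return res.ok;
  } catch {
    vscode.window.showErrorMessage(
      "Could not reach Ollama on localhost:11434. Is `ollama serve` running?"
    );
    return false;
  }
}
```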
Release Notes
0.1.0
Initial release of LlamaVerse: VSCode Edition
Stay Connected
I regularly update my Twitter and Substack with insights and discussions on computer science, machine learning, and deep learning. Follow me for the latest news, updates, and articles on these exciting fields!
Stay tuned for more content and updates!
Enjoy your journey through the LlamaVerse!