LlamaVerse
appyzdl5 | 445 installs | Free
Enter the LlamaVerse: Chat with Ollama models directly in VS Code
LlamaVerse 🪄✨: VSCode Edition

Enter the LlamaVerse and chat with Ollama models directly in your VS Code editor!

Features

  • Connect to your local Ollama instance
  • Choose from available Ollama models
  • Chat with AI models in a dedicated VS Code webview

Requirements

  • VS Code 1.60.0 or higher
  • Ollama installed and running on your local machine

Installation

  1. Install the extension from the VS Code Marketplace
  2. Ensure Ollama is installed and running on your machine
  3. Run the following command in a terminal to start the Ollama server:
    ollama serve
    
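Step 3 can be sanity-checked from code. The sketch below is illustrative, not part of LlamaVerse itself: it assumes Node 18+ (so `fetch` is a global) and Ollama's default port 11434, and the helper name `ollamaIsRunning` is made up for this example. It probes Ollama's `/api/tags` endpoint, which lists locally installed models:

```typescript
// Minimal reachability check for a local Ollama server.
// Assumptions: Ollama is on its default port 11434; Node 18+ provides fetch.
const OLLAMA_BASE = "http://localhost:11434";

async function ollamaIsRunning(base: string = OLLAMA_BASE): Promise<boolean> {
  try {
    // /api/tags lists installed models; any 2xx response means the server is up.
    const res = await fetch(`${base}/api/tags`);
    return res.ok;
  } catch {
    // Connection refused: `ollama serve` has not been started yet.
    return false;
  }
}
```

If this resolves to `false`, start the server with `ollama serve` and try again.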

Usage

  1. Open the Command Palette (Ctrl+Shift+P)
  2. Type "Start LlamaVerse Chat" and select the command
  3. Choose your desired Ollama model from the dropdown
  4. Start chatting!
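Under the hood, an extension like this typically talks to Ollama's REST API. The sketch below shows the kind of request body such an extension might POST to Ollama's `/api/chat` endpoint; the endpoint and its `model`/`messages`/`stream` fields come from Ollama's public API, while `buildChatRequest` is a hypothetical helper, not LlamaVerse's actual code:

```typescript
// A chat message in the shape Ollama's /api/chat endpoint expects.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for POST http://localhost:11434/api/chat.
// Passing the full message history is how multi-turn context is kept.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    model,         // e.g. a model chosen from the dropdown, like "llama3"
    messages,      // entire conversation so far
    stream: false, // false = one complete JSON reply instead of chunks
  };
}

const req = buildChatRequest("llama3", [
  { role: "user", content: "Hello!" },
]);
```

With `stream: true` (Ollama's default), the server instead returns a sequence of partial responses, which is what lets a webview render the reply token by token.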

Known Issues

  • The extension currently only works with Ollama running on localhost
  • Limited error handling for network issues

Release Notes

0.1.0

Initial release of LlamaVerse: VSCode Edition


For more information

  • Ollama Official Website
  • VS Code Extension API

Stay Connected

I regularly update my Twitter and Substack with insights and discussions on computer science, machine learning, and deep learning. Follow me for the latest news, updates, and articles on these exciting fields!

  • Twitter: @appyzdl5
  • Substack: Subscribe here

Stay tuned for more content and updates!

Enjoy your journey through the LlamaVerse!
