PyllamaUI for VS Code 🧠🦙

A lightweight offline AI assistant inside VS Code powered by Python + Ollama.
Access local LLMs directly from your editor — no cloud, no telemetry, no internet required.



🌟 About

PyllamaUI (VS Code Edition) is an extension that lets you chat with locally hosted LLMs, using Python as the backend and Ollama as the model runner.

It's fully offline, privacy-friendly, and designed for low-resource systems.


✨ Features

  • 💬 Chat UI inside a VS Code panel
  • 🧠 Interact with local Ollama models (llama3, mistral, etc.)
  • 🐍 Python-powered backend (via run_prompt.py; a minimal sketch follows this list)
  • 🔌 Uses VS Code’s WebView for integrated GUI
  • 🚫 Works without internet – offline-first by design
  • 📁 All user data processed locally
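
For a sense of how the pieces connect, here is a minimal sketch of what a run_prompt.py-style backend could look like. This is an illustration only, not the extension's actual source: it assumes Ollama's standard local HTTP endpoint (http://localhost:11434/api/generate) and a simple command-line calling convention.

```python
# Minimal sketch of a run_prompt.py-style backend (illustrative only,
# not the extension's actual source). Assumes a local Ollama server
# listening on its default port, 11434.
import json
import sys
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def run_prompt(model: str, prompt: str) -> str:
    """Send one prompt to a local Ollama model and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON object, not a token stream
    }).encode("utf-8")
    request = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

if __name__ == "__main__":
    # Usage: python run_prompt.py <model> <prompt...>
    model, prompt = sys.argv[1], " ".join(sys.argv[2:])
    print(run_prompt(model, prompt))
```

Everything here talks only to the local Ollama server, which matches the extension's no-cloud, no-telemetry design.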

🛠️ Requirements

💻 System

  • VS Code 1.75+
  • Python 3.10+
  • Ollama (installed and running locally; see the quick check below)
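
Before launching the extension, you can confirm that Ollama is reachable. The check below uses Ollama's standard /api/tags endpoint, which lists locally installed models; it is a quick sketch, not part of the extension itself.

```python
# Quick check that a local Ollama server is up and which models it offers.
import json
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        models = [m["name"] for m in json.loads(resp.read())["models"]]
    print("Ollama is running. Installed models:", ", ".join(models) or "none")
except OSError:
    print("Ollama does not appear to be running on localhost:11434.")
```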

✅ Model Recommendations

| Use Case | Model Name | Approx. Size | Description |
|----------|------------|--------------|-------------|
| 📝 Text Chat | tinyllama | ~600 MB | Lightweight text model |
| 💻 Coding Help | deepseek-coder | ~700 MB | Designed for code generation |
| ⚡ All-in-One | gemma3n | ~5.5 GB | Great for both chat & coding (Google DeepMind) |

✅ If unsure, just install gemma3n for the best all-around experience.


To run a model:

```bash
ollama run gemma3n
ollama run tinyllama
ollama run deepseek-coder
```
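
Note that ollama run downloads a model automatically on first use, so the sizes listed above are fetched once and then cached locally. To download a model ahead of time without starting a chat session, use ollama pull followed by the model name (for example, ollama pull gemma3n).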

📌 Special thanks to Ollama for making local LLMs accessible to all.


Developed with ❤️ by BHUVANESH M
