Ollama Chat

ashishalex | 3,014 installs | Free
Chat offline with models available to download from Ollama
Installation
Launch VS Code Quick Open (Ctrl+P), paste the following command, and press enter.

A VS Code extension that lets you chat offline with self-hosted models downloaded from Ollama.

[demo animation: ollama-chat-demo]

How to use

  1. Install Ollama and download a model.

    ollama run qwen2.5-coder
    
  2. Open a terminal and run ollama serve, or launch the Ollama app manually.

  3. Open the Command Palette in VS Code by pressing Cmd+Shift+P (Mac) or Ctrl+Shift+P (Windows/Linux), then run the Ollama Chat command. This opens the chat window shown in the demo above.
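Before opening the chat window it helps to confirm that the Ollama server from step 2 is actually listening. As a minimal sketch (a hypothetical helper, not part of the extension), a plain socket check against Ollama's default port 11434 is enough:

```python
import socket

def ollama_running(host="127.0.0.1", port=11434, timeout=0.5):
    """Return True if something accepts connections on Ollama's default port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused or timed out: the server is not reachable.
        return False

if not ollama_running():
    print("Ollama does not appear to be running - try `ollama serve`.")
```

This only verifies that the port is open, not that the process on it is Ollama; it simply fails fast instead of leaving the chat window hanging on its first request.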

TODO

  • [ ] feat: chat with a file or a selection
  • [ ] feat: show an error if Ollama is not running (started manually via the app or with ollama serve)
  • [ ] feat: show an error if the user has no model installed, with an example command for installing one
  • [ ] feat: limit messages to a certain number of tokens?
  • [ ] feat: process PDFs?
  • [ ] feat: audio search
  • [ ] build: do we need a build system such as webpack?
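The two error-reporting TODOs above could both be served by one query: Ollama's local HTTP API exposes GET /api/tags, which lists installed models. A hedged sketch (hypothetical helper names; the endpoint and default port are from Ollama's API, everything else is illustrative):

```python
import json
import urllib.request

def installed_models(base_url="http://127.0.0.1:11434"):
    """Return installed model names from the local Ollama server,
    or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except OSError:
        # Covers connection refused, timeouts, and other URL errors.
        return None

models = installed_models()
if models is None:
    print("Ollama is not running - start it with `ollama serve` or open the app.")
elif not models:
    print("No models installed - try `ollama run qwen2.5-coder`.")
```

A single call therefore distinguishes "server down" from "server up but no models", which maps directly onto the two error messages the TODOs describe.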