Andes Models


Ruslan Suleymanov

Run local Ollama models in a UI
Installation
Launch VS Code Quick Open (Ctrl+P), paste the following command, and press enter.

Andes


A Visual Studio Code (or Cursor, Windsurf, etc.) extension that provides a local UI for Ollama models.

This project is no longer actively maintained and may contain bugs. I recommend using Cline instead.

Requirements

Node.js, Golang, VS Code (for some reason Cursor, Windsurf, and Trae don't work), make, and Ollama.

How to build

Set the environment variables in a .env file, copying the parameters from the .env.example file. Make sure you have npm, VS Code (Cursor, Windsurf, and Trae don't work), and Ollama installed. Before testing, make sure the code command is available in your PATH:

  • Press CMD + Shift + P and search for Shell Command: Install 'code' command in PATH;
  • Restart VS Code.

And you are ready for extension testing:

  • First, run make in Terminal. It will install the necessary dependencies;
  • Go to Run -> Start Debugging (or just press F5);
  • Wait until a new VS Code window (Extension Development Host) appears;
  • Press CMD + Shift + P and search for Andes;
  • Make sure Ollama is serving: run ollama serve in Terminal (it listens on port 11434).
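The last step can also be checked programmatically. A minimal sketch in TypeScript (the helper names `ollamaUrl` and `listModelNames` are illustrative, not part of the extension) that builds the default Ollama base URL and parses the `GET /api/tags` response Ollama uses to list installed models:

```typescript
// Sketch: talk to a local Ollama server on its default port, 11434.
// ollamaUrl and listModelNames are illustrative names, not the extension's API.

const OLLAMA_PORT = 11434;

// Build an absolute URL for an Ollama API path.
function ollamaUrl(path: string): string {
  return `http://localhost:${OLLAMA_PORT}${path}`;
}

// Shape of one entry in Ollama's GET /api/tags response.
interface OllamaModel {
  name: string;
}

// Extract model names from an /api/tags payload, e.g. the result of
// `await (await fetch(ollamaUrl("/api/tags"))).json()`.
function listModelNames(payload: { models: OllamaModel[] }): string[] {
  return payload.models.map((m) => m.name);
}

// Example with a canned response (no server required):
const sample = { models: [{ name: "llama3:8b" }, { name: "deepseek-r1:7b" }] };
console.log(ollamaUrl("/api/tags"));   // http://localhost:11434/api/tags
console.log(listModelNames(sample));   // [ 'llama3:8b', 'deepseek-r1:7b' ]
```

If the fetch to `/api/tags` fails, `ollama serve` is not running.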

Features

  • Manage your installed Ollama models locally;
  • Chat with AI;
  • Observe the reasoning process and decision-making of AI models.
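On the last feature: reasoning models served by Ollama, such as deepseek-r1, wrap their chain-of-thought in `<think>...</think>` tags inside the reply. A hedged sketch of how such output could be split into reasoning and answer (the function name and return shape are illustrative, not the extension's actual code):

```typescript
// Sketch: split a reasoning model's raw output into its thinking and its answer.
// Models like deepseek-r1 emit reasoning inside <think>...</think> tags;
// splitReasoning and ModelOutput are illustrative names.

interface ModelOutput {
  reasoning: string; // chain-of-thought; empty for non-reasoning models
  answer: string;    // the visible reply
}

function splitReasoning(raw: string): ModelOutput {
  const match = raw.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) {
    return { reasoning: "", answer: raw.trim() };
  }
  const answer = raw.replace(match[0], "").trim();
  return { reasoning: match[1].trim(), answer };
}

const out = splitReasoning("<think>2+2 is basic arithmetic.</think>The answer is 4.");
console.log(out.reasoning); // 2+2 is basic arithmetic.
console.log(out.answer);    // The answer is 4.
```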

Changelog

Visit CHANGELOG.md.

TODO

Visit TODO.md.

About project

I created Andes to provide a simple, authentication-free VS Code extension that works exclusively with Ollama models in VS Code, Windsurf, Trae, Cursor, and other VS Code forks. While other plugins like Cline and Continue support multiple AI providers and require authentication, I wanted a focused solution specifically for local Ollama models.

Sadly it's written in TypeScript, but the API for Markdown-to-HTML formatting is in Golang. Initially I wanted to write the whole project in Golang, but it's simpler in TypeScript, since I can request Ollama's endpoints directly from TypeScript.
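Requesting Ollama directly from TypeScript amounts to building a small JSON body and POSTing it. A sketch of what that looks like for Ollama's `POST /api/chat` endpoint (`buildChatRequest` is an illustrative helper, not the extension's actual code):

```typescript
// Sketch: build the request body for Ollama's POST /api/chat endpoint.
// buildChatRequest is an illustrative helper name.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    model,
    messages,
    stream: false, // set true to receive newline-delimited JSON chunks
  };
}

const body = buildChatRequest("llama3:8b", [
  { role: "user", content: "Hello!" },
]);

// To actually send it (requires `ollama serve` to be running):
// const res = await fetch("http://localhost:11434/api/chat", {
//   method: "POST",
//   body: JSON.stringify(body),
// });
console.log(JSON.stringify(body));
```

Because the whole round trip is plain `fetch` plus JSON, no Go HTTP client or extra SDK is needed on the chat path.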
