ollama-vscode-integration

Colcear Ionut | 237 installs | Free
Extension for VSCode to prompt local Ollama models.
Installation
Launch VS Code Quick Open (Ctrl+P), paste the extension's install command, and press Enter.

Ollama VSCode Extension

An extension that integrates with the Ollama client so you can prompt local language models right from your IDE.

Requirements

  1. Install Ollama: https://ollama.com/
  2. In your terminal, pull the model you want to use: "ollama pull modelName" (browse available models at https://ollama.com/library/). See the example after this list.
  3. In the VS Code settings, under Ollama Extension Settings, set the model you want the extension to use.
    • The default model is deepseek-r1:1.5b, as it is the most accessible to run.
    (Screenshot: Settings)
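
For example, to pull the default model and confirm it is available locally, you can use the standard Ollama CLI (the model name here is just the extension's default; substitute any model from the library):

```sh
# Download the model so Ollama can serve it locally
ollama pull deepseek-r1:1.5b

# List locally available models to confirm the pull succeeded
ollama list
```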

Features

This extension provides a keyboard shortcut (Alt+Shift+Z) to prompt a local Ollama model with the selected code and an optional input box.
(Screenshot: Selection and input field)

A custom keyboard shortcut for the ollama.prompt command can also be configured in the VS Code settings.
(Screenshot: Settings)
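
As a sketch of what a custom binding might look like (the key combination below is only an illustration, not a default of the extension), an entry in keybindings.json could be:

```json
// keybindings.json (Preferences: Open Keyboard Shortcuts (JSON))
[
  {
    // Illustrative chord; pick any combination that does not clash with your setup
    "key": "ctrl+alt+o",
    "command": "ollama.prompt",
    // Only trigger while a text editor has focus
    "when": "editorTextFocus"
  }
]
```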

The output is presented on a secondary tab.
(Screenshot: Output)

Extension Settings

This extension contributes the following settings:

  • ollama.enable: Enable/disable this extension.
  • ollama.modelName: The model name to use with Ollama.
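
A minimal settings.json sketch using these two settings (the values are examples; ollama.enable is assumed to be a boolean and ollama.modelName a string, matching the descriptions above):

```json
// settings.json (Preferences: Open User Settings (JSON))
{
  // Turn the extension on or off
  "ollama.enable": true,

  // Model to prompt; it must already be pulled with "ollama pull"
  "ollama.modelName": "deepseek-r1:1.5b"
}
```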

Release Notes

1.0.0

Initial release.
