Ask-Localassistant

zhihaoguo

Installation
Launch VS Code Quick Open (Ctrl+P), paste the extension's install command, and press Enter.

Getting Started

Install the askllm extension from the Visual Studio Marketplace.

Introduction

  • Call a local large language model (LLM) from within VS Code.
  • Use the selected code and the conversation history as prompts to the LLM (see the sketch after this list).
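
The extension's own source is not shown on this page, but the two points above can be put together with the ollama-js client and the VS Code API. The sketch below is a minimal, hypothetical illustration rather than the extension's actual code; the function name askSelection and the module layout are assumptions.

// Hypothetical sketch: ask a local model about the current selection.
// Assumes Ollama is running locally and the 'ollama' (ollama-js) package is installed.
import * as vscode from 'vscode';
import ollama from 'ollama';

// Running chat history, so follow-up questions keep their context.
const history: { role: string; content: string }[] = [];

export async function askSelection(model: string, question: string): Promise<string> {
  const editor = vscode.window.activeTextEditor;
  const selection = editor ? editor.document.getText(editor.selection) : '';

  // Attach the selected code (if any) to the user's question.
  const prompt = selection ? `${question}\n\n${selection}` : question;
  history.push({ role: 'user', content: prompt });

  // ollama-js talks to the local Ollama server (http://localhost:11434 by default).
  const response = await ollama.chat({ model, messages: history });
  history.push(response.message);
  return response.message.content;
}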

Requirements

  • Download Ollama for macOS first.
  • Ollama is also available for Windows.
  • The local model needs to be downloaded from the Ollama library, for example:
# example for llama3
ollama run llama3   # downloads llama3 automatically on first run
ollama list         # if llama3 appears in the list, the download succeeded
ollama rm llama3    # remove llama3

Extension Settings

  • Type 'ask llm' in the input box at the top of the window.
  • Choose a model (a model-picking sketch follows this list).
  • Ask your question.
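
As a rough illustration of the "choose model" step, the sketch below lists the models already pulled with Ollama and offers them in a Quick Pick. It assumes the ollama-js client; the function name pickModel is hypothetical and not taken from the extension.

// Hypothetical sketch: let the user pick one of the locally installed Ollama models.
import * as vscode from 'vscode';
import ollama from 'ollama';

export async function pickModel(): Promise<string | undefined> {
  // Equivalent to `ollama list`: returns the models already pulled onto this machine.
  const { models } = await ollama.list();
  if (models.length === 0) {
    vscode.window.showWarningMessage('No local models found. Run "ollama run llama3" first.');
    return undefined;
  }
  // Show the model names (e.g. "llama3:latest") in a Quick Pick at the top of the window.
  return vscode.window.showQuickPick(models.map((m) => m.name), {
    placeHolder: 'Choose a local model to ask',
  });
}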

For more information

This project is suitable for coding without an internet connection: most of the time we have simply forgotten an API or some function names, and it is frustrating when there is no way to look them up. The extension integrates Ollama's API with VS Code to minimize switching between applications. The model's responses are temporarily stored as Markdown and can be saved if necessary.
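
A minimal sketch of the "temporarily stored in Markdown" behaviour described above, assuming only the standard VS Code API; the function name showAnswerAsMarkdown is hypothetical. The reply is opened as an untitled Markdown document beside the editor, so it stays temporary until the user decides to save it.

// Hypothetical sketch: show the model's reply as an untitled Markdown document.
import * as vscode from 'vscode';

export async function showAnswerAsMarkdown(question: string, answer: string): Promise<void> {
  const doc = await vscode.workspace.openTextDocument({
    language: 'markdown',
    content: `## ${question}\n\n${answer}\n`,
  });
  // Open beside the code so there is no need to switch applications or windows.
  await vscode.window.showTextDocument(doc, {
    preview: false,
    viewColumn: vscode.ViewColumn.Beside,
  });
}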

  • ollama
  • ollama-js
  • llama3

If you think this extension needs more work or you run into an issue, feel free to report it or fork the project. Thank you!

Enjoy!
