Visual Studio AI Buddy

Bert O'Neill

Allow Visual Studio to connect to a local LLM host (such as Ollama) running DeepSeek and send it prompts, without compromising your company’s codebase or inadvertently leaking sensitive intellectual property.

AI Buddy (Visual Studio) Extension & Hosting DeepSeek LLM Locally

Introduction

This document describes how your company can set up an LLM host (such as Ollama) locally and connect the Visual Studio AI Buddy extension to it. For example, you can prompt a DeepSeek LLM with the security of knowing that your prompt hasn’t left your company’s network.

Purpose

This document provides a comprehensive overview of how your company can keep its AI prompts private and avoid leaking sensitive information, even when using an LLM model from DeepSeek.

Scope

The scope of this document is to convey the steps involved in connecting Visual Studio to your local host and sending AI prompts to it from within the editor.

Prerequisites

  1. Knowledge of an LLM host (such as Ollama)
  2. Know your host's API endpoint
  3. Know your host’s API key (optional)
  4. Know the (Ollama) LLM model you wish to use (e.g. DeepSeek-R1)

LLM Hoster (Ollama) Setup

Download and install Ollama using the latest installer for your environment: https://ollama.com/

[Screenshot]

Download LLM and Start Hoster

Select the appropriate LLM - https://ollama.com/search

Or go directly to the DeepSeek model at https://ollama.com/library/deepseek-r1 and select the variant with the appropriate parameter count. Then click the copy button and paste the copied command into a terminal window.

[Screenshot]

[Screenshot]

Once the model has been downloaded and is running, you can verify that Ollama is running from the system tray:

[Screenshot]

You can also list the LLMs that you have downloaded, and which are available to load, by running the command ollama list:

[Screenshot]
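Ollama also exposes this information over its local HTTP API. As a minimal sketch, assuming Ollama's default port (11434) and its documented /api/tags endpoint, the downloaded models could be read programmatically like this (the helper functions are illustrative, not part of Ollama or AI Buddy):

```python
import json
from urllib.request import urlopen

def model_names(tags_response: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response."""
    return [m["name"] for m in tags_response.get("models", [])]

def list_local_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Query the local Ollama host for its downloaded models."""
    with urlopen(f"{base_url}/api/tags") as resp:
        return model_names(json.load(resp))

# Example of the response shape returned by /api/tags:
sample = {"models": [{"name": "deepseek-r1:7b"}, {"name": "llama3.2:latest"}]}
print(model_names(sample))  # ['deepseek-r1:7b', 'llama3.2:latest']
```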

Using AI Buddy

Once you have the AI Buddy extension installed, you must configure its settings to point at your local LLM host. To do this, open the editor by loading a file in any coding language (all the menu options are only available from the editor context menu), right-click in the editor pane, select the menu option Your AI Buddy, and then select the sub-menu item Configure AI Provider Settings.

Configure AI Provider Settings

[Screenshot]

This will display the AI Settings dialog, where you must enter the:

  1. URL of your company’s host API (e.g. http://localhost:11434/api/generate)
  2. LLM Name (e.g. deepseek-coder-v2)
  3. Software Language (used in some of the prompts so that responses target a particular language)
  4. Testing Framework (used in some of the prompts so that responses target a particular framework)

[Screenshot]
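Under the hood, a host URL like the one above points at Ollama's /api/generate endpoint, which accepts a JSON body containing the model name and the prompt. As a minimal sketch of assembling such a request from the settings above (the helper function is illustrative, not part of the extension):

```python
import json

def build_generate_request(api_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Assemble the URL and JSON body for an Ollama /api/generate call."""
    body = {"model": model, "prompt": prompt, "stream": False}
    return api_url, json.dumps(body).encode("utf-8")

url, body = build_generate_request(
    "http://localhost:11434/api/generate",
    "deepseek-coder-v2",
    "Explain this C# method.",
)
print(url)
print(json.loads(body)["model"])  # deepseek-coder-v2
```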

Generate Unit Tests

Within the editor, highlight the code that you wish to generate unit tests for. Right-click the code and select the submenu item Generate Unit Tests.

[Screenshot]

Your selected code will be pasted into a generic prompt for DeepSeek to analyze. An hourglass will appear while DeepSeek generates the response.

[Screenshot]

This will then generate several unit tests based on your Testing Framework and Software Language settings. From here, you can copy and paste the code into your test class.

[Screenshot]
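The exact prompt template is internal to AI Buddy, but a hypothetical template along these lines illustrates how the Software Language and Testing Framework settings could shape the request sent to the model:

```python
def unit_test_prompt(code: str, language: str, framework: str) -> str:
    """Build a hypothetical unit-test prompt from AI Buddy-style settings."""
    return (
        f"Generate {framework} unit tests for the following {language} code. "
        f"Return only the test class.\n\n{code}"
    )

prompt = unit_test_prompt("int Add(int a, int b) => a + b;", "C#", "xUnit")
print(prompt.splitlines()[0])
```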

Suggest (Coding) Improvements

Highlight the code you wish to get coding improvements for, then right-click the code and select the submenu item Suggest Improvements.

[Screenshot]

Your highlighted code will be pasted into the generic prompt dialog, and DeepSeek will analyze it and return a response.

[Screenshot]

Below, you can see the response from DeepSeek for your prompt.

[Screenshot]

Comment Code

Highlight the code you wish to generate comments for, right-click the code and select the submenu item Comment Code.

[Screenshot]

Your code will be pasted into the generic prompt dialog, which DeepSeek will take as your prompt.

[Screenshot]

DeepSeek will respond with comments on your code, which you can use to comment your code or paste into a document (e.g. a user guide).

[Screenshot]

AI Prompt Dialog

There is also a generic prompt dialog where you can enter any AI prompt. You don’t need to highlight any code, as this is a general-purpose prompt dialog.

[Screenshot]

From here, you can enter any prompt message for DeepSeek to analyze and respond to. Enter the prompt into the grey textbox and hit the Submit button.

[Screenshot]

After hitting the Submit button, the prompt is posted to DeepSeek.

[Screenshot]

DeepSeek will respond to your prompt just as the live DeepSeek API would, but without your prompt leaving your network.

[Screenshot]
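For reference, Ollama's /api/generate endpoint streams its reply as newline-delimited JSON, with each chunk carrying a fragment of the text in its response field. A minimal sketch of reassembling such a stream, assuming that documented shape (the helper is illustrative):

```python
import json

def join_stream(ndjson_lines: list[str]) -> str:
    """Concatenate the 'response' fragments of a streaming /api/generate reply."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):  # final chunk carries done=true
            break
    return "".join(parts)

lines = [
    '{"response": "Hello", "done": false}',
    '{"response": " world", "done": true}',
]
print(join_stream(lines))  # Hello world
```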

Pasting an Image With Prompt

If you host LLMs other than DeepSeek (e.g. LLaVA 1.6 or Llama 3.2), you can paste (Ctrl+V) an image into the prompt textbox and submit it along with a prompt. Currently, DeepSeek doesn’t accept images as part of the prompt.

[Screenshot]

The dialog will expand the image and post it, together with the prompt message, to the hosted LLM.

[Screenshot]
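For hosts that accept images, Ollama's /api/generate endpoint takes them as base64-encoded strings in an images array alongside the prompt. A sketch of building such a request body (the helper function is illustrative):

```python
import base64
import json

def build_image_request(model: str, prompt: str, image_bytes: bytes) -> bytes:
    """JSON body for /api/generate with a base64-encoded image attached."""
    body = {
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,
    }
    return json.dumps(body).encode("utf-8")

# Fake image bytes stand in for real PNG data here.
payload = json.loads(build_image_request("llava", "Describe this image.", b"\x89PNG..."))
print(payload["model"], len(payload["images"]))  # llava 1
```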

Health Check

You may want to check whether the host is up and running; you can do this with the submenu item Health Check.

[Screenshot]

A valid response:

[Screenshot]

An invalid response:

[Screenshot]
