AI Buddy (Visual Studio) Extension & Hosting a DeepSeek LLM Locally

Introduction

This document provides details on how your company can set up an LLM Hoster locally and connect the Visual Studio AI Buddy extension to it. For example, you can prompt a DeepSeek LLM with the security of knowing that your prompt has never left your company's network.

Purpose

This document provides a comprehensive overview of how your company can keep its AI prompts private and avoid leaking any sensitive information, even when using an LLM model from DeepSeek.

Scope

The scope of this document is to convey the steps involved in connecting Visual Studio to your local Hoster and issuing AI prompts from within the IDE.

Prerequisites

LLM Hoster (Ollama) Setup

Download and install Ollama using the latest installer for your environment - https://ollama.com/

Download LLM and Start Hoster

Select the appropriate LLM - https://ollama.com/search - or go directly to the DeepSeek model - https://ollama.com/library/deepseek-r1 - and select the variant with the appropriate parameter count. Then click the copy button (red arrow) and paste the copied command into a Terminal/Command window. Once the model has been downloaded and is running, you can verify that Ollama is running from the System Tray. You can also list the LLMs that you have downloaded, and which you can load, by running the command "ollama list" - this will list the available models (see the example commands below).
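As a rough illustration, the commands below show what this looks like in a terminal. The model tag used here (deepseek-r1:7b) is just an example - use whichever parameter size you copied from the model page.

    # Download (if needed) and start the selected model
    # (the 7b tag is only an example - pick the size that suits your hardware)
    ollama run deepseek-r1:7b

    # List the models downloaded to this machine
    ollama list

    # Example "ollama list" output (names, sizes and dates will vary):
    # NAME              ID      SIZE      MODIFIED
    # deepseek-r1:7b    ...     4.7 GB    2 minutes ago

By default, Ollama serves its API on http://localhost:11434 - keep this address to hand when configuring AI Buddy in the next section.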
Using AI Buddy

Once you have the AI Buddy extension installed, you must configure its settings against your local LLM Hoster. To do this, open the editor by loading a file in any coding language (all the menu options are only available from the editor context menu) and right-click in the editor pane. Select the menu option Your AI Buddy and then select the sub-menu item Configure AI Provider Settings.

Configure AI Provider Settings

This will display the AI Settings dialog, from which you must enter the required provider settings.

Generate Unit Tests

Highlight the code that you wish to generate unit tests for within the editor. Right-click the code and select the sub-menu item Generate Unit Tests. Your selected code will be pasted into a generic prompt for DeepSeek to analyse. An hourglass will appear while DeepSeek is generating the response. Several unit tests will then be generated, based on your Testing Framework and Programming Language settings. From here, you can copy and paste the code into your test class.

Suggest (Coding) Improvements

Highlight the code you wish to get coding improvements for, then right-click the code and select the sub-menu item Suggest Improvements. Your highlighted code will be pasted into the generic prompt dialog, and DeepSeek will analyse it and return a response. Below, you can see DeepSeek's response to your prompt.

Comment Code

Highlight the code you wish to generate comments for, right-click the code and select the sub-menu item Comment Code. Your code will be pasted into the generic prompt dialog, which DeepSeek will take as your prompt. DeepSeek will respond with comments on your code, which you can use to comment your code or paste into a document (e.g. a User Guide).

AI Prompt Dialog

There is a generic prompt dialog from which you can issue an AI prompt. You don't need to highlight any code, as this is a general-purpose prompt dialog. From here, you can enter any prompt message for DeepSeek to analyse and respond to. Enter the prompt into the grey textbox and hit the Submit button. After hitting Submit, the prompt is posted to DeepSeek, which will respond as normal - just as with the live DeepSeek API, but without your prompt leaving your network.

Pasting an Image With Prompt

If you host LLMs other than DeepSeek (e.g. LLaVA 1.6 or Llama 3.2), you can paste (Ctrl+V) an image into the prompt textbox and submit it along with a prompt. Currently, DeepSeek doesn't accept images as part of the prompt. The image will be attached to the prompt message and posted to the hosted LLM (see the sketch below).
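Under the hood, a local Ollama Hoster exposes an HTTP API, and the sketch below shows roughly what such requests look like. It assumes Ollama's standard /api/generate endpoint and example model tags (deepseek-r1:7b, llava); how AI Buddy itself formats its requests is not documented here.

    # A plain prompt, posted to the local Hoster (it never leaves your network)
    curl http://localhost:11434/api/generate -d '{
      "model": "deepseek-r1:7b",
      "prompt": "Explain the difference between an interface and an abstract class",
      "stream": false
    }'

    # A prompt plus an image, for a vision-capable model such as LLaVA.
    # "images" takes base64-encoded image data; DeepSeek models do not accept it.
    curl http://localhost:11434/api/generate -d '{
      "model": "llava",
      "prompt": "What is in this picture?",
      "stream": false,
      "images": ["<base64-encoded image data>"]
    }'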
Health Check

You may want to check whether the Hoster is up and running; you can do this with the sub-menu item Health Check. The result will show either a valid response (the Hoster is up) or an invalid response (the Hoster is down or unreachable).
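If you prefer to check from outside Visual Studio, and assuming a default Ollama install, the root endpoint answers with a short status string:

    # Check that the Hoster is up (default Ollama address and port assumed)
    curl http://localhost:11434/
    # Expected reply when healthy:
    # Ollama is running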