# Ollama VSCode Chat

A Visual Studio Code extension that provides a user-friendly GUI for interacting with Ollama, allowing you to select and chat with different local LLMs.

## Features

- Chat with locally installed LLMs from a panel inside VSCode
- Switch between the models available in your Ollama installation
## Requirements

To use this extension, you need:

- Visual Studio Code
- Ollama installed and running locally
- At least one downloaded model (see Installation below)
## Usage

Open the Ollama Chat Panel.
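Under the hood, a chat panel like this talks to the local Ollama server over its REST API. The sketch below is an assumption about how such a call can look, not the extension's actual code: the endpoint (`POST /api/chat` on port 11434) and payload shape come from Ollama's API, while the helper names are hypothetical.

```typescript
// Minimal sketch of a non-streaming chat call to a local Ollama server.
// Endpoint and payload shape follow Ollama's REST API; the function
// names (buildChatRequest, chat) are assumptions for illustration.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the URL and JSON body for one chat request.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    url: "http://localhost:11434/api/chat",
    body: JSON.stringify({ model, messages, stream: false }),
  };
}

// Send the request and return the assistant's reply text.
async function chat(model: string, messages: ChatMessage[]): Promise<string> {
  const { url, body } = buildChatRequest(model, messages);
  const res = await fetch(url, { method: "POST", body });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.message.content;
}
```

A streaming UI would instead set `stream: true` and read the response line by line.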
## Download New LLMs (Not yet implemented)

To install new models from Ollama directly through VSCode:
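Since this feature is not yet implemented, the following is only a sketch of how it could work: Ollama exposes a `POST /api/pull` endpoint for downloading models, so the extension could call it as below. The helper names are assumptions, not existing code.

```typescript
// Sketch only: pulling a model through Ollama's local REST API
// (POST /api/pull). buildPullRequest and pullModel are hypothetical
// names for illustration.
function buildPullRequest(model: string) {
  return {
    url: "http://localhost:11434/api/pull",
    // "name" identifies the model to download; stream: false waits
    // for completion instead of streaming progress events.
    body: JSON.stringify({ name: model, stream: false }),
  };
}

async function pullModel(model: string): Promise<void> {
  const { url, body } = buildPullRequest(model);
  const res = await fetch(url, { method: "POST", body });
  if (!res.ok) throw new Error(`Ollama pull failed: ${res.status}`);
}
```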
## Installation

### 1. Install Ollama

Ollama is required to run local models. Follow the instructions below to install it.

#### Linux & macOS

Run the following command in your terminal:
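This is the standard install script published at ollama.com; verify it against the current instructions on the site before piping it to your shell:

```shell
# Downloads and runs Ollama's official install script.
curl -fsSL https://ollama.com/install.sh | sh
```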
#### Windows

Download and install Ollama for Windows.

### 2. Install a Small LLM

After installing Ollama, you need at least one model to start chatting. Here are some lightweight models you can try:
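For example (the model tags below are assumptions drawn from the public Ollama library; any small model works):

```shell
# Each pull downloads a model into your local Ollama store.
ollama pull llama3.2:1b   # ~1B-parameter Llama
ollama pull gemma2:2b     # Google Gemma 2, 2B parameters
ollama pull phi3:mini     # Microsoft's compact Phi-3
```

You can list what is installed with `ollama list`.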
### 3. Install the Extension
## Screenshots

- Ollama Chat Interface in VSCode
- Selecting an LLM

## Notes
## License

MIT License