# Mistral-LLM-Ext README

Use Ollama to run Mistral LLM models in Visual Studio Code. This extension lets you interact with Mistral LLM models directly from Visual Studio Code, giving developers and data scientists a seamless way to leverage large language models in their coding environment. The extension runs Mistral models through Ollama, a tool for running large language models locally, and provides features such as code completion, chat interfaces, and custom prompts, making it easier to integrate Mistral's capabilities into your development workflow.
## Contributing

We welcome contributions to this project! If you have ideas for improvements, bug fixes, or new features, please feel free to submit a pull request or open an issue.

## License

This project is licensed under the MIT License. See the LICENSE file for details.

## Acknowledgements

This extension is built on top of the Ollama platform, which provides a powerful and flexible way to run large language models locally. We thank the Ollama team for their work in making this possible.

## Changelog
## Support

If you encounter any issues or have questions about using this extension, please open an issue on the GitHub repository.

## One downside

Responses may take a while, depending on your hardware resources and the size of the model.

## Table of Contents
## Features

- Code completion powered by Mistral models
- Chat interfaces inside Visual Studio Code
- Custom prompts for your development workflow
## Installation

In the command line:

### Option One

If you are using the Command Prompt (cmd):
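For example, assuming you install Ollama through the Windows Package Manager (winget; the package ID `Ollama.Ollama` is an assumption here):

```bat
REM Sketch: install Ollama from the Command Prompt via winget (package ID assumed)
winget install Ollama.Ollama
```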
### Option Two

If you are using PowerShell:
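One possible sketch (the download URL is an assumption; check Ollama's download page for the current installer) is to fetch and launch the Windows installer:

```powershell
# Sketch: download the Ollama Windows installer and launch it (URL assumed)
Invoke-WebRequest -Uri "https://ollama.com/download/OllamaSetup.exe" -OutFile "$env:TEMP\OllamaSetup.exe"
Start-Process "$env:TEMP\OllamaSetup.exe"
```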
Then load the LLM.
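For example, to pull Mistral 7B with the Ollama CLI (`mistral` is Ollama's standard tag for it; the exact model name the extension expects may differ):

```shell
# Download the Mistral model so Ollama can serve it locally
ollama pull mistral
```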
Please note that this will take some time, depending on your internet connection.
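To confirm the download completed, you can list the models Ollama has stored locally:

```shell
# Show the models available to Ollama on this machine
ollama list
```

When the model appears in that list, run the steps below: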
## Usage
Enjoy!

## Requirements

To use this extension, you must have the following installed and configured on your machine:

- Visual Studio Code
- Ollama, to run the models locally
- At least one Mistral model pulled into Ollama (see Installation)