Mistral LLM ext
Publisher: marcoslara

This extension provides a command to interact with the Mistral LLM.
Mistral-LLM-Ext README

Use Ollama to run Mistral LLM models in Visual Studio Code.

This extension allows you to interact with Mistral LLM models directly from Visual Studio Code, providing a seamless experience for developers and data scientists who want to leverage the power of large language models in their coding environment.

This extension runs Mistral models using Ollama, a tool for running large language models locally. It provides features such as code completion, a chat interface, and custom prompts, making it easier to integrate Mistral's capabilities into your development workflow.
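Under the hood, an extension like this talks to Ollama's local REST API, which listens on port 11434 by default. As a rough sketch of that round-trip (not the extension's actual source; the helper names here are illustrative), a one-shot completion against Ollama's `/api/generate` endpoint looks like this:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot completions.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(prompt: str, model: str = "mistral") -> dict:
    """Build the JSON body Ollama expects; stream=False asks for a single response."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "mistral") -> str:
    """Send the prompt to the local Ollama server and return the generated text."""
    body = json.dumps(build_generate_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `generate("Explain this regex: ^\\d+$")` with the Ollama service running returns the model's answer as a plain string.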


Contributing

We welcome contributions to this project! If you have ideas for improvements, bug fixes, or new features, please feel free to submit a pull request or open an issue.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Acknowledgements

This extension is built on top of the Ollama platform, which provides a powerful and flexible way to run large language models locally. We thank the Ollama team for their work in making this possible.

Changelog

  • 1.0.0: Initial release of the Mistral LLM Extension for Visual Studio Code.

Support

If you encounter any issues or have questions about using this extension, please open an issue on the GitHub repository.

One downside

Responses may take a while, depending on your hardware resources and the size of the model.

Table of Contents

  • Features
  • Installation
  • Usage
  • Requirements
  • Contributing
  • License
  • Acknowledgements
  • Changelog
  • Support
  • Contact
  • FAQ
  • Troubleshooting
  • Known Issues

Features

  • Run Mistral Models: Execute Mistral models using Ollama directly from VS Code.
  • Code Completion: Get intelligent code completions powered by Mistral.
  • Chat Interface: Engage in conversations with the Mistral model, asking questions and receiving responses in real-time.
  • Custom Prompts: Create and manage custom prompts for specific tasks or queries.
  • Configuration Options: Easily configure the extension settings to suit your workflow.
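The chat feature maps naturally onto Ollama's `/api/chat` endpoint, which takes a list of role-tagged messages (the same shape as the common chat-completion format). A minimal sketch, assuming the illustrative helper name below rather than the extension's real code:

```python
import json

def build_chat_request(messages: list, model: str = "mistral") -> dict:
    """Build the body for Ollama's /api/chat endpoint: a list of
    {"role": ..., "content": ...} messages plus the model name."""
    return {"model": model, "messages": messages, "stream": False}

# A running conversation is just the message list growing over time:
history = [{"role": "user", "content": "What does the regex ^\\d+$ match?"}]
request_body = json.dumps(build_chat_request(history))
```

Appending each assistant reply back onto `history` before the next request is what gives the chat its memory of earlier turns.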

Installation

In the command line:

Option 1

If you are using the Windows Command Prompt (cmd.exe):

set OLLAMA_DEBUG=0
ollama serve

Option 2

If you are using PowerShell:

$env:OLLAMA_DEBUG="0"
ollama serve

Then pull and run the model:

ollama run mistral:latest

Please note that this may take some time, depending on your internet connection and the model size.
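Before moving on, you can confirm the Ollama server is actually reachable. `/api/tags` is Ollama's model-listing endpoint, so a quick request to it works as a health check; this small sketch (the function name is ours, not part of Ollama or the extension) returns False rather than raising if the server is down:

```python
import urllib.request
import urllib.error

def is_ollama_running(base_url: str = "http://localhost:11434") -> bool:
    """Return True if a local Ollama server answers on its default port."""
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: the server is not up.
        return False
```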

Once the model has finished downloading, follow the steps below:

  1. Open Visual Studio Code.
  2. Go to the Extensions view by clicking on the Extensions icon in the Activity Bar on the side of the window or by pressing Ctrl+Shift+X.
  3. Search for "Mistral LLM Extension".
  4. Click on the Install button to install the extension.
  5. After installation, reload Visual Studio Code to activate the extension.

Usage

  1. Open the Command Palette: press Ctrl+Shift+P (or Cmd+Shift+P on macOS).
  2. Run the extension's Mistral command and enter your prompt.

Enjoy!

Requirements

To use this extension, you must have the following installed and configured on your machine:

  1. Ollama

    • Download and install Ollama from https://ollama.com/download.
    • Start the Ollama service before using the extension.
  2. Mistral Model

    • Pull the Mistral model in Ollama by running:
      ollama pull mistral
      
  3. Visual Studio Code

    • Make sure you are running a compatible version of VS Code (see engines.vscode in package.json).
  4. This Extension

    • Install this extension from the VS Code Marketplace or from a .vsix file.
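One more detail worth knowing once everything is installed: Ollama streams responses by default, sending newline-delimited JSON chunks where each chunk carries a `response` text fragment and a `done` flag. A sketch of reassembling such a stream (illustrative helper, not the extension's code):

```python
import json

def collect_stream(ndjson_lines) -> str:
    """Join the text fragments from Ollama-style streaming chunks
    (newline-delimited JSON) into one string, stopping at done=true."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Simulated chunks shaped like Ollama's /api/generate streaming output:
sample = [
    '{"response": "Hello", "done": false}',
    '{"response": ", world", "done": false}',
    '{"response": "!", "done": true}',
]
```

Here `collect_stream(sample)` yields the full string "Hello, world!"; a real client would iterate over the HTTP response body line by line instead of a list.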