Paver
By leveraging Granite models and open-source components such as Ollama and Continue, you can write, generate, explain, or document code with full control over your data, ensuring it stays private and secure on your machine.

Getting Started

This project features an intuitive UI designed to simplify the installation and management of Ollama and Granite models. The first time the extension starts, a setup wizard is automatically launched to guide you through the installation process. You can reopen the setup wizard at any time from the command palette by running the "Paver: Setup Granite as code assistant" command.

Installation Prerequisites
Step 1: Install the Extension

Open Visual Studio Code, navigate to the Extensions tab on the left sidebar, search for "Paver," and click "Install." The Continue.dev extension will be automatically added as a dependency, if not already installed.

Step 2: Install Ollama

Once the extension is running, the setup wizard will prompt you to install Ollama. The following Ollama installation options are available:
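If you prefer to install and start Ollama from the command line rather than through the wizard, the steps look roughly like this. This is a sketch: the install script is the one published on ollama.com for Linux, and the Homebrew formula is an alternative on macOS; check Ollama's site for the instructions matching your platform.

```shell
# Linux: the official install script from ollama.com
curl -fsSL https://ollama.com/install.sh | sh

# macOS alternative: install via Homebrew
brew install ollama

# Verify the installation
ollama --version

# Start the Ollama server if it is not already running in the background
ollama serve
```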
Once Ollama is installed, the page will refresh automatically. Depending on the security settings of your platform, you may need to start Ollama manually the first time.

Step 3: Install Granite Models

Select the Granite model(s) you wish to install and follow the on-screen instructions to complete the setup. After the models are pulled into Ollama, Continue will be configured automatically to use them, and the Continue chat view will open, allowing you to interact with the models via the UI or tab completion.

About the Stack

IBM Granite Models

The Granite models are optimized for enterprise software development workflows and perform well across a variety of coding tasks (e.g., code generation, fixing, and explanation), making them versatile "all-around" code models. Granite comes in a range of sizes to fit your workstation's resources. Generally, larger models yield better results but require more disk space, memory, and processing power.

Recommendation: The 2B model should work on most machines. Use the 8B version if you're running a high-end computer. For more details, refer to Granite Models.

Ollama

Many corporations have privacy regulations that prohibit sending internal code or data to third-party services. Running LLMs locally lets you comply with these restrictions and ensures no sensitive information is sent to a remote service. Ollama is one of the simplest and most popular open-source solutions for running LLMs locally.

Continue.dev

Continue is the leading open-source AI code assistant. You can connect any models and contexts to build custom autocomplete and chat experiences inside VS Code and JetBrains.
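The wizard pulls the selected models for you, but the same result can be reached with Ollama's CLI. A minimal sketch, assuming Ollama is running; the `granite-code:8b` tag is an example and the available Granite tags may change, so check the Ollama model library for current ones:

```shell
# Pull a Granite model into the local Ollama store (what the wizard automates)
ollama pull granite-code:8b

# List the models now available locally
ollama list

# Optionally chat with a pulled model directly from the terminal
ollama run granite-code:8b
```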
For more details, refer to continue.dev.

How to Contribute to this Project?

Please check our Guidelines to contribute to our project.

License

This project is licensed under Apache 2.0. See LICENSE for more information.

Telemetry

With your approval, the Paver extension collects anonymous usage data and sends it to Red Hat servers to help improve our products and services. Read our privacy statement to learn more. This extension respects the