littlePrak is a Visual Studio Code (VS Code) extension that provides intelligent autocompletions powered by local LLMs. It aims to streamline your workflow and boost your productivity by suggesting contextually relevant code snippets, function calls, and variable names as you type.
Local LLama is a code analysis tool that uses machine learning to understand the structure and patterns in your codebase, predicting the next piece of code you are likely to write based on the context you're working in.
Smart Autocompletions: littlePrak offers intelligent autocompletions that adapt to your coding style and the specific project you're working on. It suggests relevant code snippets, function names, and variable names, making coding faster and more efficient.
Context-Aware Suggestions: The autocompletions are contextually aware and take into account the surrounding code, imported libraries, and your coding history. This ensures that the suggestions are accurate and aligned with your coding needs.
Suggestion Cycling: Cycle through the suggestions until you find one that fits.
Local LLama Integration: littlePrak builds on Local LLama's code analysis. The extension communicates with a Local LLama server running on your machine to provide real-time autocompletions based on an analysis of your codebase.
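As an illustration of the round trip involved, the sketch below shows how an extension might package the text before the cursor into a completion request for a locally running server. The endpoint, field names, and defaults are assumptions for illustration, not littlePrak's actual protocol.

```typescript
// Hypothetical sketch: packaging editor context into a completion request
// for a locally running LLM server. Field names are illustrative assumptions.

interface CompletionRequest {
  prompt: string;      // code context preceding the cursor
  n_predict: number;   // maximum number of tokens to generate
  temperature: number; // sampling temperature
}

function buildCompletionRequest(codeBeforeCursor: string, maxTokens = 64): CompletionRequest {
  return {
    prompt: codeBeforeCursor,
    n_predict: maxTokens,
    temperature: 0.2, // low temperature keeps completions close to the context
  };
}

// The request would then be POSTed to the local server, e.g.:
// await fetch("http://localhost:3000/completion", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildCompletionRequest("def add(a, b):")),
// });
```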
Remote LLM Integration: If your machine can't run an LLM locally, you can host one on a remote server and configure the extension to use it.
Follow these steps to start using littlePrak with Local LLama:
- Install the littlePrak extension from the VS Code Marketplace.
- Currently, the extension only supports the dalai platform; support for other LLaMA backends is planned.
- Install and run the dalai server following its setup instructions.
- Open your project in VS Code and start coding.
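For the dalai step, the dalai project documents commands along these lines; verify them against the dalai README for your platform, since model names and defaults may change:

```shell
# Download and set up a LLaMA model via dalai (one-time step).
npx dalai llama install 7B
# Start the local dalai server; by default it listens on http://localhost:3000.
npx dalai serve
```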
You can configure littlePrak to suit your preferences. Open Settings, search for "littlePrak", and adjust options such as autocompletion behavior and suggestion frequency.
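For example, a user-level settings.json entry might look like the fragment below. The setting names here are hypothetical placeholders, so check the actual keys listed under the littlePrak section in Settings:

```jsonc
{
  // Hypothetical keys — confirm the real names in the Settings UI.
  "littlePrak.autocomplete.enabled": true,
  "littlePrak.autocomplete.suggestionFrequency": "onType",
  "littlePrak.server.url": "http://localhost:3000"
}
```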
- Create an MVP using the dalai server. [WIP]
- Integrate the oobabooga server API.
- Add more configuration options.
- Integrate options for remote servers.
Feedback and Support
We value your feedback and are committed to improving your experience with littlePrak. If you encounter any issues, have suggestions for improvements, or want to share your success stories, please reach out to us through the issue tracker.
This project is licensed under the MIT License.