
LOCAI

LOCAI is a Visual Studio Code extension that provides a local, LLM-powered AI coding assistant directly in your editor. Leverage the power of local large language models to get intelligent code assistance while keeping all your data private and secure on your own machine.

Installation

  1. Install the extension from the VS Code Marketplace
  2. Install Ollama (https://ollama.com/) and make sure it is running (a quick way to verify this is shown below)
  3. Download at least one LLM through the extension interface
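To confirm that Ollama is reachable, you can query its local REST API (it listens on http://localhost:11434 by default). A minimal sketch in TypeScript, assuming Node 18+ for the built-in fetch:

```typescript
// Check that the local Ollama server is up and list installed models.
// Assumes Ollama's default port (11434).
async function checkOllama(): Promise<void> {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = (await res.json()) as { models: { name: string }[] };
    console.log(
      "Ollama is running. Installed models:",
      data.models.map((m) => m.name).join(", ") || "(none yet)"
    );
  } catch (err) {
    console.error("Ollama does not appear to be running:", err);
  }
}

checkOllama();
```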

Developer

Brendan Choi | 최만승. I am a Korean computer science student based in New Zealand, studying at the University of Auckland. In addition to my studies, I work as a personal trainer at CITYFITNESS NZ (Queen Street branch), so I can help you stay fit if too much programming leaves you feeling out of shape. I'm active on various social platforms, so please follow me to keep up with my latest projects and insights!

  • YOUTUBE: https://www.youtube.com/@brendanchoi7203
  • INSTAGRAM: https://www.instagram.com/brendanchoi_/
  • LINKEDIN: https://www.linkedin.com/in/manseung-choi-0447b4223/
  • FACEBOOK: https://www.facebook.com/brendan.choi.12/

Why?

This software is designed to empower everyone to code without worrying about sending sensitive code or information to external LLM servers. One of the key benefits is that you can work seamlessly without switching between multiple tabs. The release of DeepSeek was a game changer for me: it unlocked a new way for sLLMs (small LLMs) to run on slower hardware while still delivering quality responses. I am dedicated to continually updating the app to ensure it remains competitive with other chatbots. If you appreciate what I'm doing, buy me a coffee ☕️🫠

  • PAYPAL: brendanchoi0626@gmail.com
  • NZ BANK ACCOUNT: 12-3401-0083103-50

Features

  • Local LLM: Run LLMs locally on your own machine
  • Code Assistance: Get help with writing, explaining, and improving your code
  • Chat Management: Organize conversations in folders for better workflow
  • Multiple Model Options: Choose from a range of local model variants
  • Direct Code Selection: Select code snippets and perform actions via context menu
  • Fast Responses: Get AI help without sending your code to external servers

Requirements

  • Hardware: A modern CPU with at least 8GB of RAM is required. For optimal performance, a machine with 16GB+ RAM and a dedicated GPU is recommended. The (GB) figure next to each model is the VRAM required to run it. You can find your VRAM as follows:
  1. Windows: open Task Manager, select your GPU, and check the VRAM figure
  2. macOS: Apple logo (top left) -> About This Mac; usable VRAM is roughly MEMORY x 0.75
  Models with more parameters run slower and require more resources but provide better responses; models with fewer parameters run faster with fewer resources but are less capable. (A quick fit check is sketched after this list.)
  • Ollama: The extension requires Ollama to run the LLMs locally.
  • Disk Space: You'll need the amount of free disk space specified for each model you install.
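To make the macOS rule of thumb concrete, here is a small illustrative sketch; the 0.75 factor is the heuristic above, and the helper function is hypothetical, not part of the extension:

```typescript
// Rough check of whether a model's VRAM requirement fits your machine.
function fitsInVram(modelSizeGb: number, vramGb: number): boolean {
  return modelSizeGb <= vramGb;
}

const macMemoryGb = 16;                      // e.g. a 16GB MacBook
const estimatedVramGb = macMemoryGb * 0.75;  // ≈ 12GB usable (heuristic)
console.log(fitsInVram(9, estimatedVramGb)); // a 9GB model fits: true
```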

Usage

Starting a Chat

  1. Right-click anywhere in the editor
  2. Select "Load Locai Assistant" from the context menu
  3. A chat panel will open where you can start interacting with the model
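Under the hood, a chat panel like this is typically implemented as a VS Code webview. A minimal sketch of how such a panel can be opened; the command ID, view type, and HTML are illustrative placeholders, not the extension's actual internals:

```typescript
import * as vscode from "vscode";

export function activate(context: vscode.ExtensionContext) {
  // Hypothetical command ID for illustration; the real extension wires
  // its own command into the editor context menu.
  const cmd = vscode.commands.registerCommand("locai.loadAssistant", () => {
    const panel = vscode.window.createWebviewPanel(
      "locaiChat",              // internal view type
      "Locai Assistant",        // panel title shown to the user
      vscode.ViewColumn.Beside, // open next to the active editor
      { enableScripts: true }   // allow the chat UI to run scripts
    );
    panel.webview.html = "<html><body><h1>Chat UI goes here</h1></body></html>";
  });
  context.subscriptions.push(cmd);
}
```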

Getting Help with Code

  1. Select the code you want help with
  2. Right-click and choose from options like:
    • "Explain This Code"
    • "Improve This Function"
    • "Find Potential Bugs"
    • "Ask with Custom Prompt..."

Managing Chats

  • Create folders to organize related conversations
  • Save and revisit past chats
  • Delete or rename chats and folders as needed
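One plausible way for an extension to persist folders of chats is VS Code's globalState storage. A sketch with illustrative data shapes and storage key; the real extension may store chats differently:

```typescript
import * as vscode from "vscode";

// Illustrative data shapes for folders of saved chats.
interface Chat { id: string; title: string; messages: string[]; }
interface Folder { name: string; chats: Chat[]; }

// Save and load the whole folder tree via the extension's global storage.
async function saveFolders(ctx: vscode.ExtensionContext, folders: Folder[]) {
  await ctx.globalState.update("locai.folders", folders);
}

function loadFolders(ctx: vscode.ExtensionContext): Folder[] {
  return ctx.globalState.get<Folder[]>("locai.folders", []);
}
```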

Known Issues

  • May have slower performance on machines with limited resources

Roadmap

  • [ ] Code action suggestions
  • [ ] Enhanced UI with syntax highlighting in responses

Release Notes

Privacy

Locai runs all models locally on your machine. No code or queries are sent to external servers, ensuring your code and intellectual property remain private.

Feedback and Contributions

  • Contact us with feedback or suggestions

Enjoy coding with your local AI assistant!
