LMS Completions is a lightweight VS Code extension that provides inline code completions using local AI models powered by LM Studio or Ollama.
It aims to bring fast, privacy-friendly code completion to your editor with zero cloud dependencies.
🚀 Features
⚡ Inline ghost code completions
🧠 Works with local models (LM Studio or Ollama)
🌍 Works with any programming language
🤖 Works with any model your local server can run
🔒 No data leaves your machine
🛠 Simple, minimal, fast
📃 Reads related code across your project for additional context
📦 Requirements
Before using the extension, make sure you have:
Option A — LM Studio (recommended)
Download LM Studio
Download a model:
Recommended: qwen2.5-coder-0.5b-instruct
Load the model inside LM Studio
Run LM Studio’s local server
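Once the server is started, you can verify it is reachable before opening VS Code. This is a minimal sketch assuming LM Studio's default address of `http://localhost:1234` (adjustable in LM Studio's server settings):

```shell
#!/bin/sh
# Probe the LM Studio local server (default port 1234; change the URL
# if you configured a different port in LM Studio's server tab).
if curl -s --max-time 2 http://localhost:1234/v1/models > /dev/null; then
  echo "LM Studio server: reachable"
else
  echo "LM Studio server: not reachable"
fi
```

If the server is not reachable, make sure a model is loaded and the server toggle in LM Studio is switched on.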
Option B — Ollama
Install Ollama
Pull and run a supported model
Ensure the Ollama server is running
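As with LM Studio, you can confirm the Ollama server is up from the command line. This sketch assumes Ollama's default address of `http://localhost:11434`; the model tag in the comment is only an example:

```shell
#!/bin/sh
# Pull a code model first if you haven't, e.g.:
#   ollama pull qwen2.5-coder:0.5b   (example tag; any supported model works)
# Then probe the Ollama server (default port 11434).
if curl -s --max-time 2 http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama server: reachable"
else
  echo "Ollama server: not reachable"
fi
```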
🧩 Setup
The extension currently requires no configuration. Once LM Studio or Ollama is running, it works out of the box.
To generate code:
Select the code you want to work from
Run the LM Studio: Generate Code command
Enjoy 🥹
Shortcut: Ctrl + Down Arrow
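Under the hood, the extension talks to the local server over HTTP. As a rough illustration only (not the extension's exact payload), a chat-completion request against LM Studio's OpenAI-compatible endpoint looks like this; the model name is hypothetical and must match whatever model you have loaded:

```shell
#!/bin/sh
# Illustrative only: a minimal chat-completion request against LM Studio's
# OpenAI-compatible endpoint. The "model" field must match a loaded model.
curl -s --max-time 5 http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "qwen2.5-coder-0.5b-instruct",
        "messages": [{"role": "user", "content": "Complete: def add(a, b):"}]
      }' || echo "request failed (is the LM Studio server running?)"
```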
📝 Known Limitations
Completions are not streamed yet
🛠 Roadmap
Streaming completions
Model selection
Custom API base configuration
Additional providers
Settings UI
🤝 Contributing
Contributions are welcome and encouraged!
Whether you want to fix a bug, improve completion quality, add new features, or help polish the codebase — you're invited.