# LMS Completions

LMS Completions is a lightweight VS Code extension that provides inline code completions using local AI models powered by LM Studio or Ollama.
It aims to bring fast, privacy-friendly code completion to your editor with zero cloud dependencies.

## 🚀 Features

- ⚡ Inline code completions
- 🧠 Works with local models (LM Studio or Ollama)
- 🌍 Works with any programming language
- 🔒 No data leaves your machine
- 🛠 Simple, minimal, fast

## 📦 Requirements

Before using the extension, make sure you have one of the following set up:

### Option A — LM Studio (recommended)

- Install LM Studio
- Download the model `qwen2.5-coder-0.5b-instruct`
- Load the model inside LM Studio
- Run LM Studio’s local server

### Option B — Ollama

- Install Ollama
- Pull and run a supported model
- Ensure the Ollama server is running
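
Either way, the extension expects a local HTTP endpoint. A quick way to confirm your server is reachable is a small Python check (the ports are assumptions based on common defaults: LM Studio usually listens on 1234, Ollama on 11434):

```python
from urllib import request, error

def is_server_up(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at base_url, False otherwise."""
    try:
        with request.urlopen(base_url, timeout=timeout):
            return True
    except error.HTTPError:
        # The server answered, even if with an error status (e.g. 404).
        return True
    except (error.URLError, OSError):
        # Connection refused, DNS failure, timeout: nothing is listening.
        return False

if __name__ == "__main__":
    # Default ports are assumptions; adjust if you changed them.
    print("LM Studio:", is_server_up("http://localhost:1234/v1/models"))
    print("Ollama:   ", is_server_up("http://localhost:11434/api/tags"))
```

If both lines print `False`, the extension has nothing to connect to.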

## 🧩 Setup

The extension currently requires no configuration.
Once LM Studio or Ollama is running, LMS Completions will automatically attempt to connect and provide inline completions.
- No settings
- No commands
- Just install and start coding
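
Under the hood, "connect and provide inline completions" amounts to posting the code before the cursor to the local server's completion endpoint. A minimal sketch of that exchange, assuming LM Studio's OpenAI-compatible `/v1/completions` endpoint on its default port (the exact endpoint and parameters the extension uses are not documented here):

```python
import json
from urllib import request

# Default LM Studio address; an assumption, adjust to your setup.
LM_STUDIO_URL = "http://localhost:1234/v1/completions"

def build_completion_payload(prefix: str,
                             model: str = "qwen2.5-coder-0.5b-instruct") -> dict:
    """Build an OpenAI-style completion request from the text before the cursor."""
    return {
        "model": model,
        "prompt": prefix,
        "max_tokens": 64,    # keep inline suggestions short
        "temperature": 0.2,  # low temperature for predictable code
        "stop": ["\n\n"],    # stop at the first blank line
    }

def fetch_completion(prefix: str) -> str:
    """POST the payload and return the model's suggested continuation."""
    data = json.dumps(build_completion_payload(prefix)).encode("utf-8")
    req = request.Request(LM_STUDIO_URL, data=data,
                         headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["text"]

if __name__ == "__main__":
    print(fetch_completion("def fibonacci(n):"))
```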

## 📝 Known Limitations

- Completions are not streamed yet
- Only basic inline completions at this stage

## 🛠 Roadmap

- Streaming completions
- Model selection
- Custom API base configuration
- Additional providers
- Settings UI
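
On the streaming item: OpenAI-compatible servers such as LM Studio deliver `stream: true` responses as server-sent events, one `data:` line per token chunk. A sketch of parsing that format (the field layout is assumed from the OpenAI completion schema):

```python
import json

def parse_sse_stream(lines):
    """Yield text fragments from OpenAI-style 'data: {...}' event lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and comments
        data = line[len("data:"):].strip()
        if data == "[DONE]":  # sentinel marking the end of the stream
            break
        chunk = json.loads(data)
        yield chunk["choices"][0]["text"]

# Joining the fragments reconstructs the completion incrementally:
events = [
    'data: {"choices": [{"text": "return"}]}',
    'data: {"choices": [{"text": " n * 2"}]}',
    'data: [DONE]',
]
```

Here `"".join(parse_sse_stream(events))` yields `"return n * 2"`; an editor integration would instead surface each fragment as it arrives.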

## 🤝 Contributing

Contributions are welcome and encouraged!
Whether you want to fix a bug, improve completion quality, add new features, or help polish the codebase — you're invited.
👉 GitHub Repository: https://github.com/dudes-company/lms-ai
Feel free to submit issues, feature requests, and pull requests.

## 📄 License

MIT License