LM Studio Copilot

Wayne Harper | 73 installs | Free
Connects LM Studio to VS Code with inline ghost completions and streaming chat

🧠 LM Studio Copilot

An experimental, agentic VS Code extension that connects your local LM Studio instance to your development folder. Ask questions, refactor code, stream completions, and scaffold projects from prompts—all backed by your preferred local model.

⚙️ Features

  • ✨ Chat with LM Studio from inside VS Code via /v1/chat/completions
  • 📁 Select a folder to serve as your project root
  • 🧠 Stream tokenized completions directly into the chat panel
  • 🛠️ Generate entire projects via prompt-driven scaffolding
  • 💡 Inline ghost completions based on your current file and nearby definitions (see the provider sketch after this list)
  • ✍️ Inject safe, rollback-friendly edits into source files
  • 📜 Log all agent-generated edits to timestamped JSON files for review
  • 🔄 Refresh model list from LM Studio server
  • 🛑 Cancel long or runaway streaming prompts
  • 🧾 Visual log panel with time-stamped diagnostic output
  • 🎨 Theme selector: VS Code / Light / Dark
  • 🧠 Language selector for file filtering and prompt context
  • 🔗 Server URL input (default: http://localhost:1234)
  • 📝 Auto-expanding multi-line prompt box
  • 💖 Support links (Buy Me a Coffee + GitHub Sponsors)
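
As a rough illustration of how inline ghost completions can be provided in a VS Code extension, here is a minimal sketch using VS Code's inline-completion API. The endpoint, placeholder model id, catch-all file selector, and reliance on global fetch are assumptions for illustration, not this extension's actual source:

```ts
import * as vscode from "vscode";

// Sketch only: register an inline ("ghost text") completion provider that asks
// the local LM Studio server for a short continuation of the text before the cursor.
const provider: vscode.InlineCompletionItemProvider = {
  async provideInlineCompletionItems(document, position) {
    // Use everything before the cursor as prompt context.
    const prefix = document.getText(
      new vscode.Range(new vscode.Position(0, 0), position)
    );

    const res = await fetch("http://localhost:1234/v1/completions", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: "local-model",   // placeholder; pick a real id from /v1/models
        prompt: prefix,
        max_tokens: 64,
      }),
    });
    const json = (await res.json()) as { choices?: { text?: string }[] };
    const suggestion = json.choices?.[0]?.text ?? "";
    return suggestion ? [new vscode.InlineCompletionItem(suggestion)] : [];
  },
};

// Typically called from the extension's activate() function.
vscode.languages.registerInlineCompletionItemProvider({ pattern: "**" }, provider);
```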

🚀 Getting Started

1. Start LM Studio Locally

Make sure LM Studio is running. This extension connects by default to:
http://localhost:1234
You can update this inside the WebView UI using the Server URL field.
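
For reference, LM Studio's local server speaks an OpenAI-compatible API, so the streaming chat feature boils down to a request like the one below. This is a minimal sketch assuming Node 18+ with global fetch; the model id is a placeholder and this is not the extension's exact code:

```ts
// Minimal sketch: stream a chat completion from the local LM Studio server.
async function streamChat(prompt: string, serverUrl = "http://localhost:1234"): Promise<void> {
  const res = await fetch(serverUrl + "/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model",                 // placeholder; list real ids via /v1/models
      messages: [{ role: "user", content: prompt }],
      stream: true,                         // server-sent events, one token chunk at a time
    }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each complete SSE line looks like: data: {"choices":[{"delta":{"content":"..."}}]}
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";             // keep any partial line for the next chunk
    for (const line of lines) {
      const data = line.replace(/^data:\s*/, "").trim();
      if (!data || data === "[DONE]") continue;
      const token = JSON.parse(data).choices?.[0]?.delta?.content ?? "";
      process.stdout.write(token);          // the extension streams this into the chat panel
    }
  }
}
```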

2. Launch the Chat View

From the Command Palette (Ctrl+Shift+P), run:

  • LM Studio: Focus on Chat View
  • or View: Toggle LM Studio

You’ll see the chat panel with model selector, prompt box, and control buttons.

🧪 Usage Tips

  • Use 📁 Select Folder to specify the root context for your project
  • Set your desired language using the selector (TS, JS, Python, etc.)
  • Type a prompt such as:
    Create a Python project named hello-bot that logs a message and responds to input
  • The extension will stream the response, extract fenced code blocks, parse filenames (if provided via comments), and save files to a folder inside your selected root
  • All edits are logged in /ignorelogs/ as timestamped .json files (a logging sketch follows this list)
  • Refresh model list anytime using 🔄 Refresh Models
  • Use 🛑 Stop to cancel streaming if the prompt runs long
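
As a rough sketch of the logging step referenced above: the directory name matches the tip, but the entry fields and file-naming scheme are illustrative assumptions, not the extension's actual schema:

```ts
import * as fs from "fs";
import * as path from "path";

// Illustrative shape for one logged edit; the real schema may differ.
interface EditLogEntry {
  timestamp: string;   // ISO-8601, reused in the file name
  file: string;        // path of the edited source file
  prompt: string;      // the prompt that produced the edit
  before: string;      // original text, kept so the edit can be rolled back
  after: string;       // replacement text
}

// Writes one entry to <projectRoot>/ignorelogs/ as a timestamped .json file.
function logEdit(projectRoot: string, entry: EditLogEntry): string {
  const logDir = path.join(projectRoot, "ignorelogs");
  fs.mkdirSync(logDir, { recursive: true });

  // e.g. ignorelogs/edit-2025-01-01T12-00-00-000Z.json
  const fileName = "edit-" + entry.timestamp.replace(/[:.]/g, "-") + ".json";
  const logPath = path.join(logDir, fileName);
  fs.writeFileSync(logPath, JSON.stringify(entry, null, 2), "utf8");
  return logPath;
}
```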

🧱 Creating a Hello World Project

You can scaffold projects two ways:

A. Via Prompt

Send a prompt like:
Create a C# project named HelloAgent that prints Hello World
Copilot will:

  • Stream the completion
  • Detect fenced code blocks
  • Identify filenames from comments (e.g. # File: main.py), as sketched after this list
  • Save each file to /your-folder/HelloAgent/
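
The extraction step can be pictured roughly like this. A sketch only: the regexes, accepted comment styles, and helper name are assumptions, not the extension's internals:

```ts
// Sketch: pull fenced code blocks out of a streamed response and look for a
// "File:" comment on the first line of each block.
interface ScaffoldedFile {
  fileName: string | null;  // null when the block carries no File: comment
  code: string;
}

function extractFiles(response: string): ScaffoldedFile[] {
  const files: ScaffoldedFile[] = [];

  // The fence marker (three backticks) is built from an escape so this sample
  // can itself sit inside a fenced block without closing it.
  const FENCE = "\u0060\u0060\u0060";
  const blockPattern = new RegExp(
    FENCE + "[\\w+#.-]*\\r?\\n([\\s\\S]*?)" + FENCE,
    "g"
  );

  let match: RegExpExecArray | null;
  while ((match = blockPattern.exec(response)) !== null) {
    const code = match[1];
    // Accept "# File: main.py" or "// File: Program.cs" on the first line.
    const nameMatch = code.match(/^\s*(?:#|\/\/)\s*File:\s*(\S+)/i);
    files.push({ fileName: nameMatch ? nameMatch[1] : null, code });
  }
  return files;
}
```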

B. Programmatically via API

Use this function inside your extension code:

```ts
createHelloWorldProject({
  projectRoot: '/path/to/projects',
  projectName: 'MyNewProject',
  language: 'ts'
});
```

This scaffolds a Hello World project with a single file, chosen based on built-in templates:

  • TypeScript → index.ts
  • JavaScript → index.js
  • Python → main.py
  • C# → Program.cs
  • C++ → main.cpp
  • Go → main.go

If no match is found, a TypeScript template is used by default.

Subtypes for each language now exist to help fine-tune project types.
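
Under the hood, that choice amounts to a small language-to-filename lookup with a TypeScript fallback; a minimal sketch, where the map keys and helper name are illustrative, not the extension's actual implementation:

```ts
// Illustrative mapping from language id to the single scaffolded file.
const TEMPLATE_FILES: Record<string, string> = {
  ts: "index.ts",
  js: "index.js",
  python: "main.py",
  csharp: "Program.cs",
  cpp: "main.cpp",
  go: "main.go",
};

// Falls back to the TypeScript template when the language is not recognized.
function templateFileFor(language: string): string {
  return TEMPLATE_FILES[language.toLowerCase()] ?? TEMPLATE_FILES["ts"];
}
```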


💖 Support This Extension

If LM Studio Copilot helps you build or think better, consider supporting ZiCorp’s mission:

  • ☕ Buy Me a Coffee
  • 🌟 Sponsor on GitHub

Your support keeps this extension free, transparent, and evolving.


📜 License

AGPLv3 — open, agentic, and sovereign by design.
