http-lm-api

flat35hd99

Installation
Launch VS Code Quick Open (Ctrl+P), paste the following command, and press enter.

A VS Code extension that serves an HTTP API using the VS Code Language Model API.

  • English
  • 日本語

Features

  • Provides an API compatible with the OpenAI API, backed by the VS Code Language Model API.
    • You can use the API just by installing the VS Code extension.
  • All models available to GitHub Copilot Chat can be used.

Motivation

VS Code provides the Language Model API to extensions. Currently, GitHub Copilot provides LLM access for a fixed fee. If we can reach that access through an OpenAI-compatible HTTP API, we gain a lot of power!

Furthermore, for anyone looking for LLM proxy software that does not require any additional installation, this VS Code extension is a good solution.

Requirements

This extension uses the VS Code Language Model API.

If you can use GitHub Copilot, then (probably) you can use this extension.

Extension Settings

| name | default | description |
| --- | --- | --- |
| `http-lm-api.port` | `59603` | The port number the API server listens on |
| `http-lm-api.startServerAutomatically` | `true` | If true, start the server automatically after VS Code finishes initializing |
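
These settings go in your VS Code `settings.json` (user or workspace). A minimal example using the documented keys and their defaults:

```json
{
  // Port the API server listens on (default: 59603)
  "http-lm-api.port": 59603,
  // Start the server automatically once VS Code finishes initializing
  "http-lm-api.startServerAutomatically": true
}
```

VS Code's `settings.json` accepts JSONC, so the comments above are allowed.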

Specifications

  • OpenAI compatible
    • POST /chat/completion
      • Streaming mode is supported.
    • GET /v1/models
    • GET /models
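
As a sketch of how a client might talk to the server, the snippet below builds an OpenAI-style request body for the documented `POST /chat/completion` endpoint, assuming the default port `59603` from the settings above. The model name is a placeholder; use whichever model GitHub Copilot Chat exposes for you.

```python
import json

# URL assembled from the extension's documented defaults:
# port 59603 (http-lm-api.port) and the POST /chat/completion endpoint.
BASE_URL = "http://localhost:59603"
ENDPOINT = f"{BASE_URL}/chat/completion"

# OpenAI-style chat completion request body; "stream" toggles the
# extension's documented streaming mode.
payload = {
    "model": "gpt-4o",  # placeholder: any model available to Copilot Chat
    "messages": [
        {"role": "user", "content": "Hello from http-lm-api!"},
    ],
    "stream": False,
}

body = json.dumps(payload)
print(ENDPOINT)
print(body)
```

With the extension's server running, you would send `body` as the JSON payload of an HTTP POST to `ENDPOINT` (for example with `requests.post(ENDPOINT, data=body, headers={"Content-Type": "application/json"})`), or point any OpenAI-compatible client at `BASE_URL`.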