# LLM Proxy

This extension allows any BYOK (Bring Your Own Key) AI editor or extension, such as Cursor or Continue, to connect to any HTTP-compatible LLM by aliasing it as a different model.
## Why Use LLM Proxy?

Many AI-powered IDEs and extensions that support BYOK still restrict access to advanced features (such as Agent mode, image and voice inputs, or multi-modal capabilities) to a small, hardcoded list of models. This means that even if you have access to a powerful open-source model, you might not be able to use it to its full potential.

LLM Proxy solves this problem by acting as a middleman. It intercepts requests from your editor, swaps the model alias with the real model name you want to use, and forwards the request to your self-hosted or cloud-hosted LLM. This tricks the editor into thinking you are using a supported model, unlocking all of its features for use with any compatible LLM.

## Features
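The alias-swapping step described above can be sketched in a few lines. Everything here is illustrative: the function name, the alias map, and the model names are assumptions for the example, not part of the extension's actual code or API.

```python
import json

# Hypothetical mapping from the alias the editor sends to the real model name.
ALIAS_MAP = {"gpt-4o": "my-self-hosted-model"}

def rewrite_model(raw_body: bytes, alias_map: dict) -> bytes:
    """Replace the aliased model name in an intercepted request body
    with the real model name before forwarding it upstream."""
    body = json.loads(raw_body)
    alias = body.get("model")
    if alias in alias_map:
        body["model"] = alias_map[alias]
    return json.dumps(body).encode()

# The editor believes it is talking to "gpt-4o"; the proxy substitutes
# the real model name before forwarding.
request = json.dumps(
    {"model": "gpt-4o", "messages": [{"role": "user", "content": "hi"}]}
).encode()
forwarded = json.loads(rewrite_model(request, ALIAS_MAP))
print(forwarded["model"])  # → my-self-hosted-model
```

The rest of the request body (messages, temperature, and so on) passes through unchanged; only the `model` field is rewritten.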
## Setup
## Usage

Once the proxy is running, configure your AI extension to use the local proxy server:
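As one illustration, a BYOK editor with an OpenAI-compatible provider setting might be pointed at the proxy like this. The field names follow Continue's `config.json` format; the port, alias, and API key are assumptions, not values the extension mandates:

```json
{
  "models": [
    {
      "title": "LLM Proxy",
      "provider": "openai",
      "model": "gpt-4o",
      "apiBase": "http://localhost:8000/v1",
      "apiKey": "dummy"
    }
  ]
}
```

The `model` value is the alias the editor recognizes; the proxy replaces it with your real model name before forwarding the request.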
That's it! Your editor will now send requests to the local proxy, which will forward them to your chosen LLM while unlocking all of its features.

## License

This extension is licensed under the MIT License. See the LICENSE file for details.