# backseat-pilot

A VSCode extension for using local LLMs with VSCode Server, based on the llama-cpp-python server example. The LLM server URL used by the extension is configurable.
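Since the extension targets the llama-cpp-python server example, a local backend can be started roughly like this (a sketch; the model path is an example, and the server listens on port 8000 by default):

```shell
# Install the server extra and launch it against a local GGUF model.
pip install 'llama-cpp-python[server]'
python3 -m llama_cpp.server --model ./models/llama-model.gguf
# The server is then reachable at http://localhost:8000
```

The extension's URL setting could then point at that address, e.g. in `settings.json` (the setting key below is hypothetical, not confirmed from the source — check the extension's contributed configuration for the actual name):

```json
{
  "backseat-pilot.llmUrl": "http://localhost:8000"
}
```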
## Publish your own version
## Usage

Commands are based on ai_extension_vscode. Open the command palette (`Cmd+Shift+P`) =>