LocalPythonCodingLLM

This extension runs an LLM on unit tests and documentation, and writes a Python function that passes the tests.
Installation
Launch VS Code Quick Open (Ctrl+P), paste the extension's install command, and press Enter.

LocalPythonCodingLlm

This extension runs an LLM locally using Python. You can query it in the chat window.

Example usage

The user asks: @LocalPythonCodingLLM /queryLLM Write a function to count the number of vowels in a given string. test {assert count_vowels("python") == 1, assert count_vowels("a") == 1}. The model answers with correct code.
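For the query above, a passing answer would look something like the sketch below (the exact code the model returns will vary from run to run):

```python
def count_vowels(s: str) -> int:
    """Return the number of vowels in the given string."""
    return sum(1 for ch in s.lower() if ch in "aeiou")

# The tests supplied in the query then pass:
assert count_vowels("python") == 1  # only "o" is a vowel
assert count_vowels("a") == 1
```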

Features

  • offline
  • no data collection
  • open source
  • unlimited queries

Keep in mind: ChatGPT/Copilot will be more accurate, since they run on far more processing power.

Requirements

  • Python available on the PATH
  • the github.copilot-chat extension installed in VS Code
  • a GPU with CUDA is strongly recommended; otherwise the LLM will be very slow (a quick check is sketched below)
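A minimal sketch for checking whether a CUDA-capable GPU is visible, assuming the local model runs on PyTorch (the extension does not state its backend, so adapt this to whatever stack you use):

```python
# Hypothetical CUDA check; assumes the local LLM runs on PyTorch.
import torch

if torch.cuda.is_available():
    print(f"CUDA device found: {torch.cuda.get_device_name(0)}")
else:
    print("No CUDA device found; the LLM will run on the CPU and be very slow.")
```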

Disclaimer: This extension generates code automatically. You are solely responsible for reviewing and testing any output before using it in production. No warranty is provided.

Issues

Please report issues at https://github.com/manuel-Oelmaier/LocalPythonCodingLlm/issues

Release Notes

0.0.1: initial extension release

0.0.5: fixed bugs, in particular one preventing the extension from working properly on Windows; added non-CUDA support
