# LocalPythonCodingLlm

This extension runs an LLM locally with Python. You can query it in the Chat window.

## Example usage
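A minimal sketch of what querying a local model with Python can look like, assuming a Hugging Face `transformers` backend; the model name and code below are illustrative assumptions, not this extension's actual implementation:

```python
# Sketch: query a local LLM with Python (assumed transformers backend;
# the model name is an illustrative assumption, not what the extension uses).
import torch
from transformers import pipeline

# Fall back to CPU when CUDA is unavailable (non-CUDA support).
device = 0 if torch.cuda.is_available() else -1

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-Coder-0.5B-Instruct",
    device=device,
)

prompt = "Write a Python function that reverses a string."
result = generator(prompt, max_new_tokens=128)
print(result[0]["generated_text"])
```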
## Features

Keep in mind: ChatGPT/Copilot will be more accurate, since they have far more processing power available.

## Requirements
## Disclaimer

This extension generates code automatically. You are solely responsible for reviewing and testing any output before using it in production. No warranty is provided.

## Issues

Please report issues to https://github.com/manuel-Oelmaier/LocalPythonCodingLlm/issues

## Release Notes

### 0.0.1

Initial extension release.

### 0.0.5

Fixed bugs, in particular one that prevented the extension from working properly on Windows; added non-CUDA support.