An offline, ChatGPT-like assistant integrated into Visual Studio.
EdgeLlama runs your LLaMA-family models (Alpaca, Vicuna, and CodeLlama) directly on your PC, with no internet connection required.
All data stays on your PC, making it safe to use within your organization.
[BACKGROUND]
This is a Visual Studio port of llama.cpp, a project that aims to bring LLMs to edge devices.
This version of EdgeLlama works on Intel-based CPUs with AVX2 support.
[INSTALLATION]
Download a pre-built llama.cpp-compatible GGUF model from the internet, for example: codellama-13b.Q2_K.gguf
Copy the model into any folder.
Install the EdgeLlamaNet.vsix file.
For the first launch, run Visual Studio as administrator and select the newly downloaded model.
EdgeLlama can be found under View > Other Windows > EdgeLlama.
Type your question in the text field provided and click the "Ask" button.
Subsequent launches of Visual Studio do not require administrative access.
[UPDATES 1.0.3]
Added support for Visual Studio Community, Professional, and Enterprise editions.