# Agaze AI Helper

Agaze AI Helper is a Visual Studio Code extension that provides AI-powered assistance for developers. It lets users interact with AI models, submit queries, and receive responses directly within the VS Code environment.

## Features

- Query a locally running AI model without leaving VS Code
- Submit prompts and receive AI responses directly in the editor
## Installation

To use Agaze AI Helper, you need Ollama installed locally and a model running.

### 1. Install Ollama Locally

**On macOS & Linux:**
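Ollama publishes a one-line install script; the command below reflects the current instructions on ollama.com, so check there if it has changed:

```shell
# Download and run the official Ollama install script
curl -fsSL https://ollama.com/install.sh | sh
```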
**On Windows (PowerShell):**
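One option is the winget package manager; the package ID below is an assumption, so verify it (or download the installer from ollama.com directly) if the command fails:

```shell
# Install Ollama via winget (package ID may vary; check with `winget search ollama`)
winget install Ollama.Ollama
```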
Verify the installation:
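After installation, confirm the CLI is on your PATH:

```shell
# Prints the installed Ollama version if the install succeeded
ollama --version
```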
### 2. Download and Run a Model

After installing Ollama, pull a model to use with Agaze AI Helper. For example, to pull and run a DeepSeek chat model:
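A sketch of the pull-and-run step; the exact model tag is an assumption (the Ollama library lists DeepSeek models under names like `deepseek-r1`), so substitute whichever tag you want to use:

```shell
# Download the model weights (tag is an example; browse the Ollama library for current names)
ollama pull deepseek-r1

# Start an interactive chat session with the model
ollama run deepseek-r1
```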
To list available models:
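```shell
# Show all models downloaded on this machine
ollama list
```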
### 3. Run Ollama as a Local API

To make the model accessible to Agaze AI Helper:
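```shell
# Start the Ollama server; it listens on port 11434 by default
ollama serve
```

On most installs, Ollama also runs as a background service automatically, in which case this step may already be done.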
Ollama will now be available at `http://localhost:11434`.

## Using Agaze AI Helper in VS Code

With Ollama running, open VS Code, invoke the extension, and submit your query; the AI response appears directly in the editor.
## Accessing Ollama from a Remote VM or Another Device

If you want to access Ollama from another machine or a VM, follow these steps:

### 1. Allow Remote Access

By default, Ollama only listens on `127.0.0.1` (localhost), so other machines cannot reach it.
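Ollama reads the `OLLAMA_HOST` environment variable to decide which address to bind; setting it to `0.0.0.0` exposes the server on all interfaces:

```shell
# Bind the Ollama server to all network interfaces instead of localhost only
OLLAMA_HOST=0.0.0.0 ollama serve
```

If Ollama runs as a systemd service, set `OLLAMA_HOST` in the service's environment instead and restart the service.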
### 2. Open Firewall Ports

Ensure your firewall allows incoming connections on port 11434.

**For Ubuntu/Debian (UFW):**
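```shell
# Allow inbound TCP traffic on Ollama's default port
sudo ufw allow 11434/tcp

# Confirm the rule was added
sudo ufw status
```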
**For CentOS/RHEL (firewalld):**
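```shell
# Permanently open port 11434, then reload firewalld to apply the change
sudo firewall-cmd --permanent --add-port=11434/tcp
sudo firewall-cmd --reload
```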
**For AWS EC2:**
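Add an inbound rule for port 11434 to the instance's security group, either in the AWS console or with the AWS CLI. The security group ID and CIDR below are placeholders; restrict the CIDR to trusted addresses rather than opening the port to the internet:

```shell
# Allow inbound TCP 11434 from a trusted network range
# (sg-xxxxxxxx and 203.0.113.0/24 are placeholders; substitute your own values)
aws ec2 authorize-security-group-ingress \
  --group-id sg-xxxxxxxx \
  --protocol tcp \
  --port 11434 \
  --cidr 203.0.113.0/24
```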
### 3. Connect to Ollama from Another Machine

On another device, replace `localhost` in the API URL with the server's IP address.
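For example, you can test the connection with a request to Ollama's `/api/generate` endpoint; the IP address and model tag below are placeholders:

```shell
# Query the remote Ollama API (replace 192.0.2.10 with your server's IP
# and deepseek-r1 with the model you pulled)
curl http://192.0.2.10:11434/api/generate \
  -d '{"model": "deepseek-r1", "prompt": "Hello", "stream": false}'
```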
If using SSH tunneling for security:
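```shell
# Forward local port 11434 to the remote server's Ollama port over SSH
# (replace user and your-server-ip with your own credentials)
ssh -L 11434:localhost:11434 user@your-server-ip
```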
With the tunnel open, you can access the API at `http://localhost:11434` as if Ollama were running locally, without exposing port 11434 to the network.

## Summary

1. Install Ollama and verify the installation.
2. Pull and run a model.
3. Start the Ollama server so the model is available at `http://localhost:11434`.
4. (Optional) Enable remote access and open port 11434 in your firewall.
Now you're ready to use Agaze AI Helper with your locally running AI model! 🚀