Leo is a "Google Antigravity" class AI that runs entirely on your local machine. It allows you to build software, fix bugs, and scaffold projects without sending your code to the cloud.
🚀 True Autonomy (Local)
Autonomous Project Builder: Leo creates files, folders, and code structures automatically.
Universal Local Link: Works with Ollama, LM Studio, LocalAI, or any OpenAI-compatible server (see the example after this list).
Privacy Core: Cloud is strictly optional; by default, 100% of your code stays on your disk.
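For instance, any server that speaks the OpenAI-compatible API can be checked from the command line. A minimal sketch, assuming LM Studio on its default port 1234 and Ollama on its default port 11434:

```bash
# Assumption: LM Studio's OpenAI-compatible server on its default port
curl http://localhost:1234/v1/models

# The same check against Ollama's OpenAI-compatible endpoint
curl http://localhost:11434/v1/models
```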
⚡ "Futuristic" Configuration
Dedicated Settings UI: Click the ⚙️ Gear icon to access the neural configuration panel.
Live Connection Test: Verify your local AI connection instantly.
Manual Overrides: Set any Model Name (e.g. deepseek-coder) and any Base URL.
Local AI Integration: Uses your local Ollama instance for AI model inference (see the sketch after this list).
Context-Aware: Automatically provides the content of your active editor to the AI.
Safety Guardrails: All AI-suggested code edits require explicit user confirmation.
Configurable: Easily set your Ollama API endpoint and preferred model via VS Code settings.
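To illustrate the local integration mentioned above, here is a direct call to Ollama's generate endpoint. This is only a manual smoke test under the assumption that the llama3 model is already pulled; Leo talks to this same local server for you, so none of this is required in normal use:

```bash
# Manual smoke test of the local Ollama API (not needed for normal use).
# "stream": false makes Ollama return a single JSON object instead of a stream.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Explain what a mutex is in one sentence.",
  "stream": false
}'
```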
Getting Started
Install the Extension: Install Leo from the packaged .vsix file (see the commands after this list).
Ensure Ollama is Running: The default URL is http://localhost:11434.
Configure: Go to Settings -> Leo to set your model (default: llama3).
Chat: Click the Leo icon in the Activity Bar to start.
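From a terminal, the whole flow might look like this (the .vsix filename is illustrative; substitute the file you downloaded):

```bash
# 1. Install the extension from the packaged .vsix (filename is illustrative)
code --install-extension leo-0.1.0.vsix

# 2. In another terminal, start Ollama and pull the default model if needed
ollama serve
ollama pull llama3

# 3. Confirm the server answers at the default URL
curl http://localhost:11434
```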
Configuration
You can configure Leo via the Settings UI (Gear icon) or manually in settings.json (full example below):
leo.ai.mode: Choose between 'local' or 'cloud'.
leo.local.baseUrl: Your Ollama/LM Studio URL (default: http://localhost:11434).
leo.local.model: Local model name (e.g. llama3, mistral).
leo.cloud.apiKey: OpenAI/Anthropic API Key.
leo.cloud.provider: Cloud provider (openai or anthropic).
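Putting it together, a typical local-only settings.json might look like the following sketch. The values follow the defaults documented above; the cloud keys are only relevant if you switch modes, and the API key shown is a placeholder:

```jsonc
{
  // Keep inference fully local (the default mode)
  "leo.ai.mode": "local",
  "leo.local.baseUrl": "http://localhost:11434",
  "leo.local.model": "llama3",

  // Only used when leo.ai.mode is set to "cloud"
  "leo.cloud.provider": "openai",
  "leo.cloud.apiKey": "sk-..." // illustrative placeholder, not a real key
}
```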