🧠 AI Code Assistant
Intelligent code generation and analysis powered by local AI models
Features • Installation • Getting Started • Support
✨ Features
| Feature | Description |
| --- | --- |
| 🧠 Smart Code Analysis | AI-powered understanding of your codebase with context awareness |
| ✨ Code Generation | Create files directly from natural language descriptions |
| 📋 Visual Task Planning | Real-time progress tracking with step-by-step execution |
| 🔄 Project Indexing | Full project context for accurate suggestions |
| 💬 Chat Interface | Modern, minimalistic chat experience |
| 📚 Chat History | Save and restore previous conversations |
| ⚡ Local AI Models | Privacy-first with Ollama integration |
🚀 Installation
Method 1: VS Code Marketplace (Recommended)
- Open VS Code
- Click the Extensions icon in the Activity Bar (or press Cmd+Shift+X on macOS, Ctrl+Shift+X on Windows/Linux)
- Search for "FinAI Code Assistant"
- Click Install
Or install directly from the VS Code Marketplace
Method 2: Command Line
Open the Quick Open palette (Cmd+P / Ctrl+P) and run:
ext install issamnaim.finailabz-ai-code-assistant
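Alternatively, the extension can be installed from a terminal with the VS Code CLI. This sketch assumes the `code` command is available on your PATH; the extension ID is taken from the command above:

```shell
# Install the extension via the VS Code CLI
code --install-extension issamnaim.finailabz-ai-code-assistant

# Confirm it appears in the list of installed extensions
code --list-extensions | grep finailabz
```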
Prerequisites
- VS Code v1.74.0 or higher
- Ollama installed and running
- A compatible AI model (e.g., deepseek-coder:33b, codellama)
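Before launching the extension, you can verify the prerequisites from a terminal. This assumes Ollama is listening on its default port, 11434:

```shell
# Check that the Ollama server is reachable and list installed models
curl -s http://localhost:11434/api/tags

# Equivalent check via the Ollama CLI
ollama list

# Pull a compatible model if none is installed yet
ollama pull deepseek-coder:33b
```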
🎯 Getting Started
- Open AI Assistant — Click the robot icon in the activity bar
- Configure Model — Go to Settings tab and select your Ollama model
- Start Coding — Ask the AI to create, analyze, or modify code
Example Prompts
- Create a TypeScript REST API with Express and authentication
- Analyze this function and suggest performance improvements
- Refactor this code to use async/await instead of callbacks
⌨️ Keyboard Shortcuts
| Shortcut | Action |
| --- | --- |
| Cmd+Alt+A | Analyze Code |
| Cmd+Alt+G | Generate Code |
| Cmd+Alt+C | Open Chat Panel |
⚙️ Configuration
Access settings through the Settings tab in the AI Assistant panel:
| Setting | Description | Default |
| --- | --- | --- |
| Ollama URL | Local Ollama server address | http://localhost:11434 |
| Default Model | AI model to use | deepseek-coder:33b |
| Temperature | Response creativity (0-1) | 0.7 |
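To see how a given model and temperature behave before changing the extension's settings, you can send a one-off request to the local Ollama server's generate endpoint. The prompt here is just an example; `api/generate`, `options.temperature`, and `stream` are standard Ollama API fields:

```shell
# One-off completion against the local Ollama server with an explicit temperature
curl -s http://localhost:11434/api/generate -d '{
  "model": "deepseek-coder:33b",
  "prompt": "Write a function that reverses a string in TypeScript",
  "options": { "temperature": 0.7 },
  "stream": false
}'
```

Lower temperatures (closer to 0) give more deterministic completions; higher values give more varied ones.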
🛡️ Privacy & Security
Your code stays on your machine. AI Code Assistant uses local Ollama models, ensuring:
- ✅ No cloud data transmission
- ✅ Complete privacy for proprietary code
- ✅ Offline capability
- ✅ Full control over your data
Get Help
- 🐛 Bug Reports: GitHub Issues
- 💡 Feature Requests: support@finailabz.com
- 🔒 Security: security@finailabz.com
📄 License
This software is proprietary and confidential. Unauthorized copying, distribution, or modification is strictly prohibited.
Built with ❤️ by FinAI Labz
© 2024-2026 FinAI Labz. All rights reserved.
Website • Twitter • LinkedIn