# Pair Coding Agent - VS Code Extension

Your intelligent AI pair-programming partner, with complete privacy, powered by your own fine-tuned local LLM.
## 🚀 Features
### 🎯 Intelligent Pair Programming

- **Modern AI Interface**: Responsive interface with a clean, professional design
- **Unified Experience**: A single panel containing all functionality: chat, workspace analysis, and auto mode
- **Intuitive Design**: Modern UI optimized for developer productivity
### 🔒 Complete Privacy

- **100% Local Processing**: Your code never leaves your machine
- **Fine-tuned Model**: Uses your own locally trained LLM for personalized assistance
- **No External Dependencies**: Works entirely offline once set up
### 🤖 Intelligent Features

- **Context-Aware Chat**: Understands your codebase and provides relevant suggestions
- **Auto Mode**: Autonomous task execution with step-by-step planning
- **Workspace Analysis**: Deep understanding of your project structure and dependencies
- **Code Generation**: Generate, optimize, and refactor code with AI assistance
### ⚡ Advanced Capabilities

- **Real-time Context**: Automatically includes relevant files and code in conversations
- **Multi-language Support**: Works with all major programming languages
- **Code Actions**: Copy, insert, and modify code directly from chat
- **Session Management**: Persistent conversations with history
## 📦 Installation

### Prerequisites

- **Local LLM Server**: Ensure your fine-tuned LLM server is running on `http://localhost:8001`
- **VS Code**: Version 1.80.0 or higher
### Install Extension

1. Download the `.vsix` file from the releases page
2. Open VS Code
3. Go to the Extensions view (`Ctrl+Shift+X`)
4. Click the "..." menu → "Install from VSIX..."
5. Select the downloaded file
### Quick Setup

1. Open VS Code
2. Click the robot icon in the Activity Bar
3. The extension automatically connects to your local LLM server
4. Start chatting with your AI assistant!
## 🔧 Configuration

### Settings

- **Server URL**: `pairCodingAgent.serverUrl` (default: `http://localhost:8001`)
- **Temperature**: `pairCodingAgent.temperature` (default: `0.7`)
- **Max Tokens**: `pairCodingAgent.maxTokens` (default: `2000`)
- **Auto Mode**: `pairCodingAgent.autoMode` (default: `false`)
- **Context Lines**: `pairCodingAgent.contextLines` (default: `50`)
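As an illustration, these settings might look like this in VS Code's `settings.json` (the values shown are simply the defaults listed above):

```json
{
  "pairCodingAgent.serverUrl": "http://localhost:8001",
  "pairCodingAgent.temperature": 0.7,
  "pairCodingAgent.maxTokens": 2000,
  "pairCodingAgent.autoMode": false,
  "pairCodingAgent.contextLines": 50
}
```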
### Local LLM Server

Ensure your local server provides these endpoints:

- `GET /health` - Health check with model status
- `POST /api/chat/completions` - OpenAI-compatible chat completions
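As a quick sanity check, here is a minimal shell sketch that builds an OpenAI-compatible request payload (the fields mirror the extension's default parameters above) and shows the `curl` calls for exercising both endpoints. The curl lines assume the server is already running, so they are shown as comments:

```shell
SERVER_URL="http://localhost:8001"

# OpenAI-compatible chat payload using the extension's default parameters.
PAYLOAD='{"messages":[{"role":"user","content":"Say hello"}],"temperature":0.7,"max_tokens":2000}'
echo "$PAYLOAD"

# With the server running, these calls should succeed:
#   curl -s "$SERVER_URL/health"
#   curl -s -X POST "$SERVER_URL/api/chat/completions" \
#        -H "Content-Type: application/json" -d "$PAYLOAD"
```

If the `/health` call fails, check the **Server URL** setting before debugging anything else.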
## 🎯 Usage

### Basic Chat

1. Click the robot icon in the Activity Bar
2. Type your question in the input field
3. Get intelligent responses from your fine-tuned model
### Auto Mode

1. Enable the Auto Mode toggle in the chat panel
2. Ask for a complex task, such as "Create a REST API with authentication"
3. Watch as the AI autonomously plans and executes the task
### Context Awareness

- The AI automatically includes relevant files from your workspace
- Selected code is automatically included in the context
- Project structure and dependencies are analyzed for better responses
### Code Actions

- **Copy**: Copy generated code to the clipboard
- **Insert**: Insert code at the cursor position
- **Optimize**: Get optimized versions of your code
- **Test**: Generate unit tests for your functions
🔄 Comparison with Cloud AI Assistants
Feature |
Pair Coding Agent |
Cloud AI Assistants |
Privacy |
✅ 100% Local |
❌ Cloud-based |
Customization |
✅ Fine-tuned Model |
❌ Generic Model |
Design |
✅ Modern Interface |
⚠️ Varies |
Auto Mode |
✅ Full Autonomous |
⚠️ Limited |
Cost |
✅ Free |
❌ Subscription |
Offline |
✅ Works Offline |
❌ Requires Internet |
## 🛠️ Development

### Building from Source

```bash
# Clone the repository
git clone <repository-url>
cd augment-ai-clone

# Install dependencies
npm install

# Compile TypeScript
npm run compile

# Package the extension
vsce package
```
### Local Development

```bash
# Watch mode for development
npm run watch

# Open in VS Code
code .
```

Press `F5` to launch the Extension Development Host.
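If the repository does not already ship one, a minimal `.vscode/launch.json` for the `F5` step might look like the standard VS Code extension-host configuration below. The `outFiles` path and the `npm: watch` pre-launch task are assumptions; adjust them to match this project's actual compile output and task names:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Run Extension",
      "type": "extensionHost",
      "request": "launch",
      "args": ["--extensionDevelopmentPath=${workspaceFolder}"],
      "outFiles": ["${workspaceFolder}/out/**/*.js"],
      "preLaunchTask": "npm: watch"
    }
  ]
}
```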
## 🔧 Troubleshooting

### Connection Issues

- Ensure your local LLM server is running on the configured port
- Check the server URL in settings
- Verify the server provides the required endpoints

### Performance Issues

- Adjust the temperature and max tokens settings
- Reduce context lines if responses are slow
- Check your local model's performance capabilities

### Auto Mode Issues

- Ensure Auto Mode is enabled in settings
- Check that the AI service is connected
- Review the task results panel for detailed error information
## 📝 License

This project is licensed under the MIT License - see the `LICENSE` file for details.
## 🤝 Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Test thoroughly
5. Submit a pull request
## 📞 Support

- **Issues**: Report bugs and feature requests on GitHub
- **Documentation**: Check the wiki for detailed guides
- **Community**: Join our Discord for discussions

*Experience intelligent pair programming with complete privacy and control, using your own fine-tuned local model!*