## ✨ Features

### 🎯 Core Features
- **Multi-AI Support**:
  - 🌐 OpenAI (any model)
  - 🚀 Ollama (any model)
- **Multi-Language Comments**:
  - 🇺🇸 English
  - 🇨🇳 Simplified Chinese (简体中文)
  - 🇹🇼 Traditional Chinese (繁體中文)
- **Real-time Streaming**: Watch comments appear as they are generated
- **Visual Feedback**: Newly added comments are highlighted in green
- **Non-destructive**: Writes the commented code to a new file, leaving the original untouched
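The real-time streaming behavior can be sketched as follows. This is an illustrative example, not the extension's actual source: when Ollama's `/api/generate` endpoint is called with `"stream": true`, it returns newline-delimited JSON objects, each carrying a `response` text fragment and a `done` flag, which a client concatenates to display output incrementally.

```typescript
// Shape of one streamed chunk from Ollama's /api/generate endpoint.
interface OllamaChunk {
  response: string;
  done: boolean;
}

// Accumulate the text fragments from a raw NDJSON stream buffer.
// (Sketch only; the extension's real implementation may differ.)
function collectStreamedText(ndjson: string): string {
  return ndjson
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => (JSON.parse(line) as OllamaChunk).response)
    .join("");
}
```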
### 🛠️ Advanced Features

- **Customizable AI Settings**:
  - Choose between OpenAI and Ollama
  - Configure API endpoints
  - Select AI models
  - Customize prompt templates
- **Clean Mode**: Show only code and comments, without additional information
- **Smart Comment Styles**: Automatically uses the appropriate comment syntax for each language
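The idea behind smart comment styles can be sketched like this. The mapping and function name below are hypothetical, chosen for illustration; the extension's actual implementation is not shown in this README.

```typescript
// Hypothetical mapping from file extension to line-comment prefix.
const COMMENT_PREFIXES: Record<string, string> = {
  ".py": "#",
  ".rb": "#",
  ".sh": "#",
  ".js": "//",
  ".ts": "//",
  ".go": "//",
  ".lua": "--",
};

// Pick the comment syntax for a file before asking the AI to emit comments.
// Falls back to "//", one of the most common line-comment prefixes.
function commentPrefix(extension: string): string {
  return COMMENT_PREFIXES[extension] ?? "//";
}
```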
## 📋 Requirements

### For OpenAI

- OpenAI API key
- Internet connection

### For Ollama

- Local Ollama installation
- A compatible AI model (e.g., CodeLlama)
## ⚙️ Configuration

This extension provides the following settings:

| Setting | Description | Default |
| --- | --- | --- |
| `mouse-commenter.aiType` | AI provider (`openai`/`ollama`) | `ollama` |
| `mouse-commenter.apiKey` | OpenAI API key | `""` |
| `mouse-commenter.endpoint` | Ollama API endpoint | `http://localhost:11434/api/generate` |
| `mouse-commenter.model` | AI model to use | Depends on provider |
| `mouse-commenter.baseUrl` | OpenAI API base URL | `https://api.openai.com/v1` |
| `mouse-commenter.onlyCodeAndComments` | Clean mode without extra info | `false` |
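For example, the settings above can be set in `settings.json`. This is an illustrative fragment using the keys from the table; the model name `codellama` is only an example value, not a required one.

```json
{
  "mouse-commenter.aiType": "ollama",
  "mouse-commenter.endpoint": "http://localhost:11434/api/generate",
  "mouse-commenter.model": "codellama",
  "mouse-commenter.onlyCodeAndComments": false
}
```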
## 🚀 Quick Start

1. Install the extension from the VS Code Marketplace
2. Configure your preferred AI provider:
   - For OpenAI: set your API key
   - For Ollama: make sure the local server is running
3. Open any code file
4. Click the "Add AI Comments" button in the editor toolbar
5. Select your preferred comment language
6. Watch as the AI analyzes and comments your code!
## 🎨 Usage Examples

### Generating Comments

1. Open a code file
2. Click the comment icon in the top-right corner
3. Select a language (English / 简体中文 / 繁體中文)
4. Review the generated comments in a new tab

### Configuring AI

1. Click the settings icon in the top-right corner
2. Choose your preferred AI provider
3. Configure the relevant settings
4. Save and start using!
## 🔧 Troubleshooting

Common issues and solutions:

- **OpenAI not working**: Check your API key and internet connection
- **Ollama not responding**: Ensure Ollama is running locally at `http://localhost:11434`
- **No comments generated**: Verify that the selected model supports code analysis
## 📝 Release Notes

See CHANGELOG.md for detailed release notes.

## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## 📄 License

This project is licensed under the MIT License. See the LICENSE file for details.
## 👨‍💻 Author

yeongpin - GitHub