StayGreen
Keep your GitHub contribution graph consistently green with AI-powered automated commits! StayGreen automatically tracks your actual coding work every hour and creates meaningful commits.
Features
🟩 Automatic commit tracking every hour
🤖 AI-powered commit messages using local Ollama models or GPT-4
📝 Follows conventional commit format
📊 Daily organized summaries
🔒 Secure GitHub OAuth authentication
🎯 Zero configuration needed
🔋 Local AI support with Ollama
Why StayGreen?
Show Real Progress: Automatically captures your actual coding work
Smart Commits: Uses AI (local or cloud) to generate meaningful commit messages
Stay Consistent: Never miss showing your contributions
Zero Effort: Works automatically after login
Privacy First: Option to use local AI models
Requirements
GitHub Account
Git installed and configured (a quick check is sketched after this list)
VS Code version 1.50.0 or higher
Either:
Ollama installed (for local AI processing)
OpenAI API Key (for cloud-based processing)
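A quick way to confirm the Git requirement is met (a minimal sketch; replace the example name and email with your own):
# Check that Git is installed
git --version
# Check that commits will be attributed to your account
git config --get user.name
git config --get user.email
# If either value is empty, set it (example values)
git config --global user.name "Your Name"
git config --global user.email "you@example.com"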
Installing Ollama
macOS
# Using Homebrew
brew install ollama
# Start Ollama
ollama serve
Linux
# Install using curl
curl -fsSL https://ollama.com/install.sh | sh
# Start Ollama
ollama serve
Windows
Download the installer from Ollama's official website
Run the installer
Ollama will start automatically as a system service
Verifying Installation
ollama --version
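If the version prints correctly, these commands (a minimal sketch; llama2 is just one of the supported models) confirm that models can be listed and downloaded:
# List models that are already downloaded
ollama list
# Download a model ahead of time so it is available when the extension prompts you (example: llama2)
ollama pull llama2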
Quick Start
Install the extension
Click "Login with GitHub" - the browser will open for secure authentication
Choose your AI provider:
Local AI (Recommended):
Install Ollama following the steps above
Start Ollama service if not running
The extension will automatically detect Ollama
Select a model when prompted (llama2, mistral, phi, or neural-chat)
Cloud AI: Add your OpenAI API key
That's it! StayGreen starts monitoring automatically
Commands
StayGreen: Login - Authenticate with GitHub (starts monitoring automatically)
StayGreen: Stop - Pause monitoring
StayGreen: Start - Resume monitoring
StayGreen: Change Ollama Model - Switch between available Ollama models
StayGreen: Set OpenAI API Key - Configure cloud AI commit messages
Settings
staygreen.schedule: Cron schedule for checking changes (default: */30 * * * *; see the examples after this list)
staygreen.ollamaModel: Selected Ollama model for local AI processing
  llama2: General purpose model (default)
  mistral: Fast and accurate for technical analysis
  phi: Small but powerful model for code
  neural-chat: Optimized for conversation
staygreen.openaiApiKey: OpenAI API key for cloud AI processing (optional)
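Some example values for staygreen.schedule, using standard five-field cron syntax (a sketch; the extension's scheduler is assumed to accept ordinary cron expressions):
*/30 * * * * - every 30 minutes (the default)
0 * * * * - at the top of every hour
0 9-18 * * 1-5 - hourly from 09:00 to 18:00 on weekdays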
AI Processing Options
Local Processing with Ollama
No API keys needed
Full privacy - all processing done locally
Choose from multiple optimized models
Perfect for sensitive codebases
Cloud Processing with GPT-4
No local model downloads needed
Faster processing
More consistent results
Better for resource-constrained systems
Troubleshooting Ollama
Common Issues
"Cannot connect to Ollama" error
Ensure Ollama is running with ollama serve
Check whether the service is responding on port 11434 (see the quick checks after this list)
Try restarting Ollama
Model download issues
Ensure you have stable internet connection
Check available disk space
Try downloading using command line: ollama pull modelname
Slow responses
Consider using a smaller model like 'phi'
Check system resource usage
Ensure no other intensive processes are running
Windows-specific issues
Verify Windows Defender isn't blocking Ollama
Run Ollama as administrator
Check Windows Services to ensure Ollama service is running
Model not found
Try manually pulling the model: ollama pull modelname
Verify model name is correct
Check Ollama's model list: ollama list
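For connection problems specifically, two quick checks (a sketch assuming a local install on the default port 11434):
# A running Ollama server answers on port 11434 with "Ollama is running"
curl http://localhost:11434
# If there is no response, start the server in a terminal and watch its output for errors
ollama serve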
For more detailed troubleshooting, visit Ollama's documentation.
Security
Uses secure GitHub OAuth flow
No token storage needed
Private repository for tracking
Local changes only tracked when VS Code is running
Option for complete local AI processing
How it Works
Every hour, StayGreen checks for code changes
If changes are found:
AI analyzes the changes (using Ollama locally or OpenAI in the cloud)
Creates a daily organized summary and commits it (see the example messages after this list)
Updates your contribution graph
All summaries are stored in date-organized files
Your coding activity is properly reflected on GitHub
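As an illustration of the conventional commit format mentioned above, generated messages look something like the following (hypothetical examples, not actual extension output):
docs(summary): add coding summary for 2024-05-14
feat(api): add retry logic to network requests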
License
MIT
Support
Found a bug or have a suggestion? Send me a direct message on Twitter or LinkedIn.
Made with ❤️ by hritik2002