StayGreen

hritik2002 | 20 installs | 0 ratings | Free
StayGreen automatically tracks your actual coding work every 30 minutes and pushes a commit, keeping your GitHub contribution graph consistently green.
Installation
Launch VS Code Quick Open (Ctrl+P), paste the install command for this extension, and press Enter.

StayGreen

Keep your GitHub contribution graph consistently green with AI-powered automated commits! StayGreen automatically tracks your actual coding work on a schedule (every 30 minutes by default) and creates meaningful commits.

Features

  • 🟩 Automatic commit tracking on a configurable schedule (every 30 minutes by default)
  • 🤖 AI-powered commit messages using local Ollama models or GPT-4
  • 📝 Follows conventional commit format (example after this list)
  • 📊 Daily organized summaries
  • 🔒 Secure GitHub OAuth authentication
  • 🎯 Zero configuration needed
  • 🔋 Local AI support with Ollama
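
The "conventional commit format" mentioned above follows the type(scope): description pattern. For illustration only (these exact messages are hypothetical; the real ones are AI-generated from your changes):

# Conventional commit format: <type>(<optional scope>): <short description>
git commit -m "feat(parser): add support for nested config blocks"
git commit -m "fix(auth): handle expired OAuth tokens"
git commit -m "docs: add daily coding summary"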

Why StayGreen?

  • Show Real Progress: Automatically captures your actual coding work
  • Smart Commits: Uses AI (local or cloud) to generate meaningful commit messages
  • Stay Consistent: Never miss showing your contributions
  • Zero Effort: Works automatically after login
  • Privacy First: Option to use local AI models

Requirements

  • GitHub Account
  • Git installed and configured (verification commands below)
  • VS Code version 1.50.0 or higher
  • Either:
    • Ollama installed (for local AI processing)
    • OpenAI API Key (for cloud-based processing)
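
To confirm that Git is installed and has the identity your commits will use, run the following (standard Git commands; the name and email are placeholders):

# Check Git installation and identity
git --version
git config --get user.name
git config --get user.email

# If either value is empty, set it (replace the placeholders)
git config --global user.name "Your Name"
git config --global user.email "you@example.com"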

Installing Ollama

macOS

# Using Homebrew
brew install ollama

# Start Ollama
ollama serve
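
If you installed Ollama with Homebrew, you can also run it as a background service instead of keeping ollama serve open in a terminal (assumes the Homebrew ollama formula ships a service definition, which current versions do):

# Optional: run Ollama as a managed background service
brew services start ollama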

Linux

# Install using curl
curl -fsSL https://ollama.com/install.sh | sh

# Start Ollama
ollama serve
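
On most systemd-based distributions the install script also registers a systemd service (service name assumed to be ollama), so you can manage it with systemctl instead of keeping a terminal open:

# Check whether the Ollama service is running
systemctl status ollama

# Start it now and enable it at boot
sudo systemctl start ollama
sudo systemctl enable ollama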

Windows

  1. Download the installer from Ollama's official website
  2. Run the installer
  3. Ollama will start automatically as a system service

Verifying Installation

ollama --version
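
You can also pull a model ahead of time and confirm it is available locally (any of the model names listed in the Settings section works here):

# Download a model before first use
ollama pull llama2

# List the models available locally
ollama list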

Quick Start

  1. Install the extension
  2. Click "Login with GitHub" - the browser will open for secure authentication
  3. Choose your AI provider:
    • Local AI (Recommended):
      1. Install Ollama following the steps above
      2. Start Ollama service if not running
      3. The extension will automatically detect Ollama (connectivity check shown after these steps)
      4. Select a model when prompted (llama2, mistral, phi, or neural-chat)
    • Cloud AI: Add OpenAI API key
  4. That's it! StayGreen starts monitoring automatically
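
To confirm that Ollama is reachable where the extension looks for it, you can query its local HTTP API (Ollama listens on port 11434 by default; /api/tags returns the locally installed models):

# Should return JSON listing your locally installed models
curl http://localhost:11434/api/tags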

Commands

  • StayGreen: Login - Authenticate with GitHub (starts monitoring automatically)
  • StayGreen: Stop - Pause monitoring
  • StayGreen: Start - Resume monitoring
  • StayGreen: Change Ollama Model - Switch between available Ollama models
  • StayGreen: Set OpenAI API Key - Configure cloud AI commit messages

Settings

  • staygreen.schedule: Cron schedule for checking changes (default: "*/30 * * * *" - every 30 minutes; see the examples after this list)
  • staygreen.ollamaModel: Selected Ollama model for local AI processing
    • llama2: General purpose model (default)
    • mistral: Fast and accurate for technical analysis
    • phi: Small but powerful model for code
    • neural-chat: Optimized for conversation
  • staygreen.openaiApiKey: OpenAI API Key for cloud AI processing (optional)
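
The schedule value uses standard five-field cron syntax. A few examples for reference (values other than the default are assumptions about what the setting will accept, not tested configurations):

# Cron fields: minute  hour  day-of-month  month  day-of-week
# "*/30 * * * *"    -> at minutes 0 and 30 of every hour (the default)
# "0 * * * *"       -> once an hour, on the hour
# "0 9-18 * * 1-5"  -> on the hour from 09:00 to 18:00, Monday through Friday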

AI Processing Options

Local Processing with Ollama

  • No API keys needed
  • Full privacy - all processing done locally
  • Choose from multiple optimized models
  • Perfect for sensitive codebases

Cloud Processing with GPT-4

  • No local model downloads needed
  • Faster processing
  • More consistent results
  • Better for resource-constrained systems

Troubleshooting Ollama

Common Issues

  1. "Cannot connect to Ollama" error

    • Ensure Ollama is running with ollama serve
    • Check if the service is running on port 11434 (commands after this list)
    • Try restarting Ollama
  2. Model download issues

    • Ensure you have a stable internet connection
    • Check available disk space
    • Try downloading using command line: ollama pull modelname
  3. Slow responses

    • Consider using a smaller model like 'phi'
    • Check system resource usage
    • Ensure no other intensive processes are running
  4. Windows-specific issues

    • Verify Windows Defender isn't blocking Ollama
    • Run Ollama as administrator
    • Check Windows Services to ensure Ollama service is running
  5. Model not found

    • Try manually pulling the model: ollama pull modelname
    • Verify model name is correct
    • Check Ollama's model list: ollama list
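
For the connection issues above, these commands help narrow things down (lsof and pkill apply to macOS/Linux; on Windows, check the Ollama entry in Services instead):

# Is anything listening on Ollama's default port?
lsof -iTCP:11434 -sTCP:LISTEN

# Does the API respond?
curl http://localhost:11434/api/version

# If not, restart the server in a terminal
pkill ollama
ollama serve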

For more detailed troubleshooting, visit Ollama's documentation.

Security

  • Uses secure GitHub OAuth flow
  • No token storage needed
  • Uses a private repository for tracking
  • Local changes are tracked only while VS Code is running
  • Option for complete local AI processing

How it Works

  1. On the configured schedule (every 30 minutes by default), StayGreen checks for code changes
  2. If changes are found:
    • AI analyzes the changes (using Ollama locally or OpenAI in cloud)
    • Creates a daily organized summary
    • Updates your contribution graph
  3. All summaries are stored in date-organized files
  4. Your coding activity is accurately reflected on GitHub

License

MIT

Support

Found a bug or have a suggestion? Message me directly on Twitter or LinkedIn.


Made with ❤️ by hritik2002
