# Prompt Guard

A lightweight VS Code developer tool that prevents accidental leakage of secrets and PII when sharing code with LLMs. Prompt Guard sanitizes sensitive information at copy time and places a safe version on your clipboard, so you can paste into ChatGPT, Copilot, Jira, or docs without exposing credentials, private keys, or customer data.

## Why Prompt Guard exists

Developers frequently copy code, logs, and configs into LLMs while debugging or documenting. This is also the most common point where secrets and PII are leaked. Prompt Guard acts as a last-line safety net at the moment of copying.
## How it works

When you copy with Prompt Guard active, the copied text is scanned for sensitive patterns, matches are replaced with safe placeholders, and the sanitized version is what lands on your clipboard.
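The copy-time flow can be sketched roughly as follows. This is an illustrative sketch, not the extension's actual implementation: the rule names, regex patterns, and placeholder format are all assumptions.

```typescript
// Sketch of copy-time sanitization (illustrative; names and patterns are assumptions).
// Each rule maps a regex to a placeholder. Matches are replaced before the text
// reaches the clipboard; the source file itself is never modified.

interface RedactionRule {
  name: string;
  pattern: RegExp;
  placeholder: string;
}

// Hypothetical built-in rules; real detection is likely more sophisticated.
const builtinRules: RedactionRule[] = [
  {
    name: "aws-access-key",
    pattern: /AKIA[0-9A-Z]{16}/g,
    placeholder: "[REDACTED:AWS_ACCESS_KEY]",
  },
  {
    name: "private-key-block",
    pattern: /-----BEGIN [A-Z ]*PRIVATE KEY-----[\s\S]*?-----END [A-Z ]*PRIVATE KEY-----/g,
    placeholder: "[REDACTED:PRIVATE_KEY]",
  },
  {
    name: "email",
    pattern: /[\w.+-]+@[\w-]+\.[\w.]+/g,
    placeholder: "[REDACTED:EMAIL]",
  },
];

// Apply every rule to the copied text; the sanitized result goes to the clipboard.
function sanitize(text: string, rules: RedactionRule[] = builtinRules): string {
  return rules.reduce(
    (acc, rule) => acc.replace(rule.pattern, rule.placeholder),
    text,
  );
}

const copied = 'const key = "AKIAIOSFODNN7EXAMPLE"; // contact: dev@example.com';
console.log(sanitize(copied));
// → const key = "[REDACTED:AWS_ACCESS_KEY]"; // contact: [REDACTED:EMAIL]
```

Running the rules as a simple reduce keeps them order-independent for non-overlapping patterns and makes it easy to append user-defined rules to the same pipeline.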
Your source code remains untouched.

## What gets detected (by default)

- Credentials such as API keys and tokens
- Private keys
- Customer data and other PII
## Custom rules (developer-defined)

You can add your own redaction rules using regular expressions.
These rules are applied in addition to built-in detection.

## Configuration
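As a sketch of what a custom-rule configuration might look like in VS Code settings, the setting key, field names, and placeholder format below are illustrative assumptions, not the extension's documented schema:

```json
{
  "promptGuard.customRules": [
    {
      "name": "internal-hostname",
      "pattern": "[a-z0-9-]+\\.corp\\.internal",
      "placeholder": "[REDACTED:HOSTNAME]"
    },
    {
      "name": "employee-id",
      "pattern": "EMP-[0-9]{6}",
      "placeholder": "[REDACTED:EMPLOYEE_ID]"
    }
  ]
}
```

Note that regex backslashes must be doubled inside JSON strings (`\\.` matches a literal dot).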
## Example

### Before copy
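For illustration, the snippet below is hypothetical; a copied file might contain:

```typescript
// db.ts
export const DATABASE_URL =
  "postgres://admin:SuperSecret123@prod-db.internal:5432/app";
export const SUPPORT_EMAIL = "jane.doe@example.com";
```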
### After copy
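The sanitized clipboard version might then read as follows (the placeholder format is an assumption):

```typescript
// db.ts
export const DATABASE_URL =
  "postgres://admin:[REDACTED:PASSWORD]@prod-db.internal:5432/app";
export const SUPPORT_EMAIL = "[REDACTED:EMAIL]";
```

The file on disk keeps the original values; only the clipboard copy is redacted.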
## Privacy & Security
Prompt Guard is safe to use with production code and sensitive environments.

## When this is useful

- Pasting code into ChatGPT or Copilot while debugging
- Sharing logs and configs in Jira tickets or documentation
- Any workflow where code routinely leaves your editor
## Philosophy

Prompt Guard is designed to protect developers at the exact moment mistakes happen: when copying code fast. If you regularly interact with LLMs and handle sensitive data, this tool gives you a simple, reliable safety net without slowing you down.

Clean copy. Zero leaks. Built for LLM-heavy workflows.