BlackBox Labs: a VS Code extension that detects the provenance of AI-generated code!
It flags code suspected of AI provenance.
What it does
- Analyze the current file or the whole workspace.
- Analyze a commit (added lines only) from a list or an interactive commit tree (a sketch of how added lines can be collected follows this list).
- In-editor code highlighting plus a Markdown report covering AI detection.
- Custom thresholds via bbx-policy.yml.
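Commit audits score only the lines a commit adds. Below is a minimal sketch of how those added lines could be collected with Git; the function name and the use of `git show` here are illustrative assumptions, not the extension's actual implementation.

```ts
// Illustrative sketch only: collects the added ("+") lines of a commit so they
// can be passed to an AI-provenance analyzer. Not the extension's real code.
import { execFileSync } from "node:child_process";

function addedLinesOfCommit(repoPath: string, commit: string): string[] {
  // --unified=0 drops context lines; --no-color keeps the output parseable.
  const diff = execFileSync(
    "git",
    ["show", "--unified=0", "--no-color", commit],
    { cwd: repoPath, encoding: "utf8" }
  );
  return diff
    .split("\n")
    // Keep lines added by the commit, but skip the "+++ b/file" headers.
    .filter((line) => line.startsWith("+") && !line.startsWith("+++"))
    .map((line) => line.slice(1));
}

// Example: addedLinesOfCommit(".", "HEAD")
```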
How to run it
Shortcut: Ctrl+Alt+B opens the action menu (as long as that key combination is not already bound to another command).
GUI:
- Status Bar (bottom-left): “BlackBox Policy Check” button.
- Editor title bar (top-right): flask icon.
Then pick an action: Audit file, Audit whole repo, Pick commit from list, or Pick from tree.
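If Ctrl+Alt+B is already taken, the action menu can be bound to another key through VS Code's keybindings.json. The command ID below is a placeholder assumption; check the extension's contributed commands in the Keyboard Shortcuts editor for the real ID.

```jsonc
// keybindings.json (Preferences > Keyboard Shortcuts > Open Keyboard Shortcuts (JSON))
[
  {
    // "blackboxLabs.showActionMenu" is a hypothetical command ID used for illustration.
    "key": "ctrl+alt+shift+b",
    "command": "blackboxLabs.showActionMenu",
    "when": "editorTextFocus"
  }
]
```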
Requirements
- An open folder/workspace; Git is required for the commit features.
- Internet access.
- Supported languages: Python and Java.
Policy file example
```yaml
# BlackBox Labs Default Policy
rules:
  # --- AI Provenance Thresholds ---
  - id: ai_low_risk
    type: ai-threshold
    severity: low
    threshold: 0.30
    description: Minor stylistic or token-level AI influence. Logged for provenance only
  - id: ai_medium_risk
    type: ai-threshold
    severity: medium
    threshold: 0.55
    description: Mixed authorship. Light human review recommended
  - id: ai_high_risk
    type: ai-threshold
    severity: high
    threshold: 0.75
    description: Strong AI signature. Requires manual inspection
  - id: ai_critical_risk
    type: ai-threshold
    severity: critical
    threshold: 0.90
    description: Near-pure model generation. Requires audit for logic consistency, licensing, and attribution
```
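For reference, here is a minimal sketch of how such thresholds might be interpreted: an AI-likelihood score is matched against the highest threshold it meets. The rule shape mirrors the YAML above, but the matching logic itself is an assumption, not a description of the extension's engine.

```ts
// Sketch only: maps an AI-likelihood score (0..1) to the most severe matching rule.
// Assumes the policy picks the highest threshold the score meets; this illustrates
// the policy format above, not BlackBox Labs' actual logic.
interface AiThresholdRule {
  id: string;
  type: "ai-threshold";
  severity: "low" | "medium" | "high" | "critical";
  threshold: number;
  description: string;
}

function matchRule(score: number, rules: AiThresholdRule[]): AiThresholdRule | undefined {
  return rules
    .filter((r) => r.type === "ai-threshold" && score >= r.threshold)
    .sort((a, b) => b.threshold - a.threshold)[0];
}

// Example: a score of 0.8 meets the 0.30, 0.55, and 0.75 thresholds,
// so it would resolve to ai_high_risk under the default policy.
```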