# LineageLens

LineageLens tracks AI-assisted code insertions and links them to their provenance context.
## Operating Modes
### Pure Local Mode (`local`, default)

- No backend setup required.
- Data is stored locally in VS Code storage (or in an optional workspace file mode).
- Best for individual and offline workflows.
- Start the extension from the Command Palette (`Ctrl+Shift+P`) by running **Start LineageLens**.
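In Pure Local Mode, each tracked insertion is persisted as a small record. The file name and schema below are purely illustrative (the extension's actual storage format is not documented here); they only sketch the kind of data a workspace-file record might hold:

```json
{
  "insertions": [
    {
      "file": "src/app.ts",
      "range": { "startLine": 10, "endLine": 24 },
      "tool": "copilot",
      "timestamp": "2024-05-01T12:00:00Z"
    }
  ]
}
```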
### Backend Mode (`backend`)

- Connects to your FastAPI backend for shared/team workflows.
- Enables backend ingest, auth, search, and lineage graph capabilities.
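To make the ingest path concrete, here is a minimal sketch of building a capture event for the backend's `ws://…/ws/capture` endpoint. The field names (`type`, `file`, `range`, `tool`, `timestamp`) are assumptions for illustration, not the extension's actual wire format:

```typescript
// Hypothetical capture-event shape sent to the backend's
// /ws/capture WebSocket. Field names are illustrative only.
interface CaptureEvent {
  type: "ai-insertion";
  file: string;                                  // workspace-relative path
  range: { startLine: number; endLine: number }; // inserted span
  tool: string;                                  // assistant that produced the code
  timestamp: string;                             // ISO-8601
}

function buildCaptureEvent(
  file: string,
  startLine: number,
  endLine: number,
  tool: string
): CaptureEvent {
  return {
    type: "ai-insertion",
    file,
    range: { startLine, endLine },
    tool,
    timestamp: new Date().toISOString(),
  };
}
```

A client would serialize such an event with `JSON.stringify` before sending it over the WebSocket connection.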
## Switch from Pure Local to Backend Mode

1. Start your backend service (default: `http://127.0.0.1:8787`).
2. In VS Code, open the Command Palette (`Ctrl+Shift+P`).
3. Run **AI Provenance: Switch to Backend Mode**.
4. Run **AI Insertion Detector: Backend Login** and authenticate.
5. Optional: set the backend endpoints in Settings if they differ from the defaults.
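The endpoint resolution in the last step can be sketched as a small helper that merges user settings over the documented defaults. The setting keys and default URLs come from this README's Settings JSON Example; the helper itself (`resolveBackendConfig`) is hypothetical, not part of the extension's API:

```typescript
// Hypothetical helper: merge user settings over the README's
// documented defaults. Setting keys match the Settings JSON Example.
interface BackendConfig {
  baseUrl: string;
  websocketUrl: string;
}

const DEFAULTS: BackendConfig = {
  baseUrl: "http://127.0.0.1:8787",
  websocketUrl: "ws://127.0.0.1:8787/ws/capture",
};

function resolveBackendConfig(
  settings: Record<string, string | undefined>
): BackendConfig {
  return {
    baseUrl: settings["aiInsertionDetector.backend.baseUrl"] ?? DEFAULTS.baseUrl,
    websocketUrl:
      settings["aiInsertionDetector.backend.websocketUrl"] ?? DEFAULTS.websocketUrl,
  };
}
```

With no overrides, the helper returns the defaults; any key present in settings wins.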
## Settings JSON Example

```json
{
  "aiCodeProvenance.mode": "backend",
  "aiInsertionDetector.backend.baseUrl": "http://127.0.0.1:8787",
  "aiInsertionDetector.backend.websocketUrl": "ws://127.0.0.1:8787/ws/capture"
}
```
## Useful Commands

- **Start LineageLens**
- **AI Insertion Detector: Toggle Feature**
- **AI Insertion Detector: Show Status**
- **AI Insertion Detector: Show Provenance**
- **AI Insertion Detector: Open Provenance Search**
- **AI Insertion Detector: Backend Login**
## Build and Package

```shell
npm install
npm run compile
npm run package:vsix
```