ThinkGit
Git for your thinking - Version control for AI conversations in Cursor and VS Code.
ThinkGit captures, indexes, and visualizes your AI coding sessions, making it easy to search through past conversations and understand how your codebase evolved through AI assistance.
Supported editors: Cursor (reads from Cursor's composer database) and VS Code with Copilot Chat (reads from Copilot Chat session files). ThinkGit auto-detects the available conversation source.
Quick Start
Get up and running in 5 minutes:
1. Install Ollama (required for semantic search)
# macOS
brew install ollama
# Or download from https://ollama.ai
# Start Ollama and pull the required models
ollama serve &
ollama pull nomic-embed-text
ollama pull qwen3:4b
2. Install ThinkGit
Search for ThinkGit in the VS Code / Cursor Extensions Marketplace and click Install.
3. Capture Your First Conversation
- Have an AI conversation in Cursor or VS Code Copilot Chat (Cmd+K or Cmd+L)
- Press Cmd+Shift+T to capture the session
- Click the ThinkGit icon in the sidebar to see your captured thread
4. Search Your Thinking
- Press Cmd+Shift+F to search
- Enter a natural language query like "how did I implement authentication?"
- Click a result to view the full conversation
That's it! ThinkGit will now automatically sync your conversations in the background.
Features
- Session Capture: Capture AI conversations from Cursor's composer or VS Code Copilot Chat via conversation picker
- Session pickers are workspace-scoped (only sessions tied to the current repo/workspace are shown)
- Auto-Sync: Automatically syncs conversations in the background — no manual capture needed
- Smart capture logic avoids interrupting agentic loops
- Status bar showing sync state with conversation count
- Auto-Capture on Commit: Automatically captures recent conversations when you make a git commit
- Semantic Search: Natural language queries powered by Ollama embeddings
- Quick Search: Single-step search with inline filters (tag:, after:, before:, mode:, has:, no:)
- Per-Message Search: Find specific content within conversations
- Advanced Filters: Date range, tags, commit status, AI model
- Cross-Workspace Search: Search across multiple registered workspaces
- HNSW Indexing: Optional sub-50ms search at scale for large stores
- Graph Visualization: Interactive force-directed graph of thinking sessions
- In-Graph Search: Find and cycle through threads by keyword in the toolbar
- Tag Clustering: Group threads by tag with compound nodes, collapse/expand on double-click
- Auto-Scaling: Similarity thresholds adjust automatically based on thread count
- Intent Graph: Extracts and tracks design decisions, preferences, and anti-patterns
- Intent Blame: "Git blame for thinking" — hover over code to see why it was written
- Three-layer attribution (commit-linked, file-reference, LLM-inferred)
- Color-coded status: stable, evolved, superseded
- Context Graph: Structured knowledge extraction with path reasoning and contradiction detection
- Thread Summaries: LLM-generated summaries with fallback chain, displayed in sidebar
- Thread Branching: Track divergent conversation approaches with visual branch indicators
- Export & Comparison: Export to Markdown/HTML, compare two threads side-by-side
- Concept Heatmap: Visualize concept frequency across your codebase
- Playback Simulation: Replay thread changes step-by-step on a temp git branch
- MCP Server: Model Context Protocol integration for AI assistants to query your intent graph
- Multi-Provider LLM: Ollama, Claude, and OpenAI as backends
- Data Integrity: FileLock mutex, atomic writes, consistency checks, retry with exponential backoff
- JSONL Compression: 70%+ storage reduction with transparent gzip
- Cursor Format Detection: Warns on unrecognized Cursor database versions
- Graceful Ollama Degradation: Status bar indicator, offline mode, and setup instructions
- Guided Onboarding: First-run walkthrough, welcome content, and progressive feature discovery
Requirements
- VS Code 1.85.0 or later (or Cursor)
- Ollama running locally for semantic search features
Usage
Keyboard Shortcuts
| Command | macOS | Windows/Linux |
| --- | --- | --- |
| Capture Session | Cmd+Shift+T | Ctrl+Shift+T |
| Quick Search | Cmd+Shift+F | Ctrl+Shift+F |
| Show Graph | Cmd+Shift+G | Ctrl+Shift+G |
Commands
Access via Command Palette (Cmd/Ctrl+Shift+P):
- ThinkGit: Capture Session - Select and save an AI conversation to your thinking history
- ThinkGit: Quick Search - Single-step search with inline filters (tag:auth after:2026-01 mode:messages)
- ThinkGit: Advanced Search - Multi-step semantic search with filter UI
- ThinkGit: Show Graph - Opens the interactive graph visualization
- ThinkGit: Export Thread - Export a thread to Markdown or HTML format
- ThinkGit: Compare Two Threads - Compare two threads side-by-side
- ThinkGit: Remove Duplicate Threads - Clean up duplicate conversations
- ThinkGit: Delete Thread - Remove a specific thread from history
- ThinkGit: Reindex Embeddings - Regenerate missing embeddings for all threads
- ThinkGit: Generate Summaries - Backfill LLM-generated summaries for existing threads
- ThinkGit: Toggle Intent Blame View Mode - Cycle through intent blame display modes
- ThinkGit: Show Concept Heatmap - Visualize concept frequency across your codebase
- ThinkGit: Getting Started - Open the guided walkthrough
ThinkGit adds a sidebar panel with:
- Think Graph: Interactive visualization of your thinking sessions
- Threads: Chronological list of captured sessions grouped by date, with LLM-generated summaries
What You'll See (First Time)
When you first install ThinkGit and open a workspace:
- Welcome notification - A one-time "Get Started" notification guides you to the walkthrough
- ThinkGit icon appears in the activity bar (left sidebar) - click it to open the panel
- Welcome content in the Threads sidebar with quick-action buttons (Capture Session, Get Started)
- Getting Started walkthrough - A 5-step guided tour (Welcome, Ollama Setup, Capture, Search, Graph)
- Status bar shows Ollama status:
  - $(check) Ollama - Ollama is running, ready for semantic search
  - $(warning) Ollama Offline - Start Ollama with ollama serve
- .thinkgit/ folder is created in your workspace (add to .gitignore if desired)
After capturing your first conversation (Cmd+Shift+T):
- Welcome content disappears, thread appears in the sidebar list
- Search and Graph walkthrough steps become visible (progressive disclosure)
- Node appears in the Think Graph
- Embeddings are generated in the background (may take a few seconds)
Capturing Sessions
- Have an AI conversation in Cursor or VS Code Copilot Chat
- Run ThinkGit: Capture Session or press Cmd+Shift+T
- Select the conversation you want to capture from the picker (shows preview and message count)
- The session is saved with automatic tagging and duplicate detection
Searching Past Sessions
Quick Search (recommended):
- Press Cmd+Shift+F or run ThinkGit: Quick Search
- Type a query with optional inline filters: authentication tag:auth after:2026-01 mode:messages
- Select a result to view the full conversation
Recent searches are remembered and shown as suggestions.
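The inline filter syntax can be illustrated with a short sketch. This is a hypothetical helper, not ThinkGit's actual parser: tokens with a recognized key: prefix become filters, and everything else remains part of the free-text query.

```python
# Illustrative sketch of inline-filter parsing; not ThinkGit's real code.
FILTER_KEYS = {"tag", "after", "before", "mode", "has", "no"}

def parse_query(raw: str) -> tuple[str, dict[str, list[str]]]:
    """Split a quick-search string into free text and key:value filters."""
    text_parts: list[str] = []
    filters: dict[str, list[str]] = {}
    for token in raw.split():
        key, sep, value = token.partition(":")
        if sep and key in FILTER_KEYS and value:
            # Recognized filter token, e.g. "tag:auth" or "after:2026-01".
            filters.setdefault(key, []).append(value)
        else:
            # Anything else stays in the semantic-search text.
            text_parts.append(token)
    return " ".join(text_parts), filters
```

For example, `parse_query("authentication tag:auth after:2026-01 mode:messages")` separates the word "authentication" from the three filters.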
Advanced Search (multi-step):
- Run ThinkGit: Advanced Search from the Command Palette
- Choose search mode (threads or messages)
- Optionally apply filters (date range, tags, commit status, model)
- Enter a natural language query
- Select a result to view the full conversation
Auto-Sync (Background Conversation Capture)
ThinkGit automatically syncs your AI conversations from both Cursor and VS Code Copilot Chat in the background. No manual capture is needed; conversations simply appear in the graph, ready for querying.
How it works:
- ThinkGit polls your conversation source every 30 seconds (configurable)
- Smart capture logic avoids interrupting agentic loops:
- Waits until conversation has no new messages for 2 minutes
- Waits until workspace is quiet (no file saves/edits) for 3 minutes
- Won't capture if last message is from user (awaiting response)
- Force captures after 30 minutes to prevent indefinite delay
- Embeddings are generated in the background for semantic search
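The timing rules above can be sketched as one decision function. The thresholds come from this document; the function and parameter names are hypothetical, not ThinkGit's API.

```python
# Illustrative sketch of the smart-capture timing rules described above.
CONVERSATION_IDLE_S = 2 * 60   # no new messages for 2 minutes
WORKSPACE_QUIET_S = 3 * 60     # no file saves/edits for 3 minutes
FORCE_AFTER_S = 30 * 60        # force capture after 30 minutes

def should_capture(now: float, last_message_at: float, last_edit_at: float,
                   first_seen_at: float, last_message_role: str) -> bool:
    """Decide whether a conversation is safe to capture right now."""
    if now - first_seen_at >= FORCE_AFTER_S:
        return True                      # prevent indefinite delay
    if last_message_role == "user":
        return False                     # still awaiting the AI's response
    if now - last_message_at < CONVERSATION_IDLE_S:
        return False                     # conversation may still be active
    if now - last_edit_at < WORKSPACE_QUIET_S:
        return False                     # an agentic loop may still be editing
    return True
```

The ordering matters: the force-capture check runs first so a long-running agentic session is eventually captured even if it never goes quiet.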
Intent Blame (Git Blame for Thinking)
Intent Blame shows you why code was written by connecting code lines to the AI conversations that created them. When you hover over code, you'll see the intent behind it.
How it works:
- Capture AI conversations using ThinkGit (manual or auto-capture)
- Hover over any code line to see if there's an associated intent
- The hover shows:
- The intent/decision that led to this code
- Confidence level (commit-linked, file-reference, or LLM-inferred)
- Status indicator (🟢 Stable, 🟡 Evolved, 🔵 Superseded)
- Link to the original conversation thread
MCP Server (Connect Claude to Your Thinking)
Give Claude access to your past design decisions so it checks context before writing code. When you ask Claude to refactor a module, it can first call get_decisions and get_anti_intent to learn what you chose, what you rejected, and why.
Quick setup for Claude Desktop — add to your claude_desktop_config.json:
{
"mcpServers": {
"thinkgit": {
"command": "node",
"args": [
"/path/to/thinkgit/dist/mcp-server.js",
"/path/to/your/project"
]
}
}
}
Available tools: query_intent, get_decisions, get_anti_intent, get_related, find_path, find_contradictions, get_constraints, query_context
See docs/MCP_GUIDE.md for Claude Code setup, environment variables, example sessions, and troubleshooting.
Configuration
Configure ThinkGit in VS Code Settings (Cmd/Ctrl+,):
| Setting | Default | Description |
| --- | --- | --- |
| thinkgit.capture.provider | auto | Conversation source: auto, cursor, or vscode-copilot |
| thinkgit.ollama.host | http://localhost:11434 | Ollama server URL |
| thinkgit.ollama.model | nomic-embed-text | Model for generating embeddings |
| thinkgit.autoCommit.enabled | true | Auto-commit AI-applied code changes |
| thinkgit.autoCommit.template | AI apply from ThinkGit... | Commit message template |
| thinkgit.llm.provider | ollama | LLM provider: ollama, claude, or openai |
| thinkgit.llm.apiKey | "" | API key for Claude or OpenAI |
| thinkgit.autoCapture.enabled | true | Auto-capture conversations on git commit |
| thinkgit.autoSync.enabled | true | Automatically sync conversations in the background |
| thinkgit.mcp.enabled | false | Enable MCP server for AI assistant integration |
| thinkgit.intentBlame.enabled | true | Show intent information on code hover |
| thinkgit.contextGraph.enabled | true | Enable context graph extraction |
| thinkgit.advancedIndexing | false | Use HNSW for faster search (>8k embeddings) |
| thinkgit.storage.compression | off | Compression mode: off or gzip |
See full configuration reference for all settings.
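As a concrete example, a settings.json entry enabling gzip compression and HNSW indexing might look like this (the values shown are illustrative, not recommendations):

```json
{
  "thinkgit.ollama.host": "http://localhost:11434",
  "thinkgit.llm.provider": "ollama",
  "thinkgit.autoSync.enabled": true,
  "thinkgit.advancedIndexing": true,
  "thinkgit.storage.compression": "gzip"
}
```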
Data Storage
ThinkGit stores data in a .thinkgit folder in your workspace:
.thinkgit/
├── threads.jsonl # Conversation history (append-only)
├── intents.jsonl # Intent graph nodes (design decisions, preferences)
├── edges.jsonl # Intent relationships
├── context/
│ ├── entities.jsonl # Context entities (code symbols, decisions, etc.)
│ └── relations.jsonl # Context edges (relationships between entities)
└── index/
└── embeddings.bin # Vector embeddings for search
Add .thinkgit/ to your .gitignore if you don't want to version control your thinking history.
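Because the stores are append-only JSONL, they are easy to inspect with a few lines of Python. The reader below is generic; it makes no assumption about ThinkGit's record schema.

```python
import json
from pathlib import Path

def read_jsonl(path: Path) -> list[dict]:
    """Load every record from an append-only JSONL file, skipping blank lines."""
    records = []
    with path.open(encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                records.append(json.loads(line))
    return records
```

For example, `read_jsonl(Path(".thinkgit/threads.jsonl"))` returns one dict per captured conversation.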
Troubleshooting
"Ollama is not available"
Ensure Ollama is running:
ollama serve
Check if the model is installed:
ollama list
# Should show nomic-embed-text
"No AI conversation source found"
ThinkGit auto-detects your conversation source. It looks for:
- Cursor: ~/Library/Application Support/Cursor/User/globalStorage/state.vscdb (macOS)
- VS Code Copilot Chat: ~/Library/Application Support/Code/User/workspaceStorage/<hash>/chatSessions/ (macOS)
Make sure:
- Cursor is installed, or you have VS Code with Copilot Chat
- You've had at least one AI conversation to create the database/session files
- Set thinkgit.capture.provider to force a specific source if auto-detect picks the wrong one
Unrecognized Cursor database format
This warning appears when Cursor updates its database schema to a version ThinkGit doesn't recognize. Capture may still work, but consider updating ThinkGit to the latest version. Check the Output panel for format detection logs (e.g., Detected Cursor format: v10+ (_v=10)).
Sessions not capturing
- Ensure you have an active workspace folder open
- Check the Output panel (View > Output) and select "ThinkGit" for logs
- Verify you have AI conversation history (Cursor composer or Copilot Chat)
- For Cursor provider, verify the conversation includes current workspace context (selected files/folders or file references)
- Check the thinkgit.capture.provider setting if auto-detection isn't working
- Try closing and reopening the editor if the database appears locked
Contributing
Contributions are welcome! See README_DEV.md for development setup, architecture, and testing instructions.
Note: Native addon tests (HNSW) are optional and run in a separate job. See README_DEV.md for npm run test:native.
License
MIT License - see LICENSE for details.