Give your AI tools instant access to your entire coding history. OpenMemory remembers everything you code and automatically provides context to GitHub Copilot, Cursor, Claude, and other AI assistants.
Features
Works with GitHub Copilot, Cursor, Claude, Windsurf, Codex, and any MCP-compatible AI
Auto-configures all AI tools on first run with zero manual setup (see the example entry after this list)
Tracks every file edit, save, and open automatically
Compresses memories to reduce token usage by 30-70%
Query responses under 80ms with smart caching
Real-time token savings and compression metrics
Background processing never blocks the UI
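If auto-configuration works by registering the backend MCP server with each AI tool, which is an assumption rather than documented behavior, the entry it writes would resemble the mcpServers format that MCP-compatible tools use for stdio servers. A hypothetical sketch, with the server path taken from the openmemory.mcpServerPath default:

```jsonc
{
  // Hypothetical entry; the actual file and format depend on the AI tool.
  "mcpServers": {
    "openmemory": {
      "command": "node",
      "args": ["backend/dist/ai/mcp.js"]
    }
  }
}
```

The exact files and entries the extension writes may differ per tool; check each tool's MCP configuration after the first run.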
Quick Start
Install this extension
Start the OpenMemory backend
Click the OpenMemory icon in the status bar to verify the connection
Start coding - your AI tools can now access your coding memory
Settings
openmemory.useMCP: Enable MCP protocol mode (default: false). When enabled, the extension connects to the backend MCP server, which exposes the tools openmemory_query, openmemory_store, openmemory_list, openmemory_get, and openmemory_reinforce
openmemory.mcpServerPath: Path to the backend MCP server (default: backend/dist/ai/mcp.js)
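For example, to opt in to MCP mode, set openmemory.useMCP to true in your VS Code settings; the server path shown is the documented default:

```jsonc
// .vscode/settings.json (workspace) or your user settings.json
{
  "openmemory.useMCP": true,
  "openmemory.mcpServerPath": "backend/dist/ai/mcp.js"
}
```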
Commands
OpenMemory: Query Context - Search your coding memory
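The backend's openmemory_query MCP tool (listed under openmemory.useMCP above) offers the same kind of memory search to any MCP client. A minimal sketch using the MCP TypeScript SDK (@modelcontextprotocol/sdk), assuming the backend runs as a stdio server at the default mcpServerPath and that openmemory_query accepts a query string; both are assumptions, so check the server's advertised tool schema for the real parameters:

```typescript
// Sketch: query OpenMemory directly over MCP with the TypeScript SDK.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the backend MCP server as a stdio child process
  // (path matches the openmemory.mcpServerPath default).
  const transport = new StdioClientTransport({
    command: "node",
    args: ["backend/dist/ai/mcp.js"],
  });

  const client = new Client(
    { name: "openmemory-example", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // List the tools the server advertises (should include openmemory_query,
  // openmemory_store, openmemory_list, openmemory_get, openmemory_reinforce).
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Call openmemory_query; the "query" argument name is an assumption.
  const result = await client.callTool({
    name: "openmemory_query",
    arguments: { query: "how did I configure the database connection?" },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```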