Memex
AI CLI Session History Manager - On-demand retrieval, precise recall
One Memory. All CLIs. Never Compacted. Exact Search.
What is Memex?
Memex adds long-term memory to AI coding assistants through on-demand search. Instead of losing valuable conversations to context compaction, you can search past sessions and retrieve precise context whenever you need it.
Supported Tools
- ✅ Claude Code
- ✅ Codex CLI
- ✅ OpenCode
- ✅ Gemini CLI
Features
- On-demand search - You control when to search; automatic injection is opt-in
- Original preservation - Raw messages always kept; summaries are optional layers
- Multi-CLI support - Claude Code, Codex, OpenCode, Gemini in one database
- Powerful search - Full-text (FTS5) + semantic vectors + hybrid ranking
- MCP integration - Search directly from your AI CLI
- REST API - Integrate into any workflow (see the sketch after this list)
- Local storage - All data stays on your machine
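The REST API bullet above is easiest to see with a concrete request. This is only a sketch: the /api/search path and the q/limit parameters are assumptions made for illustration, so check the API Reference for the actual endpoints.
# Hypothetical search request (endpoint and parameters are assumptions, not the documented API)
curl "http://localhost:10013/api/search?q=vector+index+bug&limit=5"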
Why Memex?
Read the full explanation in Why Memex?
Quick Start
Homebrew (macOS / Linux)
brew install vimo-ai/tap/memex
# Verify server is running
curl http://localhost:10013/health
# OK
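If the health check finds nothing listening on port 10013, the server may simply not have been started. Assuming the Homebrew formula registers a background service (an assumption; run `brew services list` to see what the formula actually provides), starting it would look like:
brew services start memex   # assumes the formula ships a service definition
curl http://localhost:10013/health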
Docker
# Optional mounts: include only the CLIs you use
#   ~/.codex                 -> Codex
#   ~/.local/share/opencode  -> OpenCode
#   ~/.gemini/tmp            -> Gemini
# Optional: OLLAMA_HOST points at Ollama on the host machine (Docker Desktop)
docker run -d -p 10013:10013 \
  -v ~/.vimo:/data \
  -v ~/.claude/projects:/claude:ro \
  -v ~/.codex:/codex:ro \
  -v ~/.local/share/opencode:/opencode:ro \
  -v ~/.gemini/tmp:/gemini:ro \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  ghcr.io/vimo-ai/memex:latest
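Once the container is running, the same health check from the Homebrew section confirms the server is reachable on the published port:
# Port 10013 is mapped to the host, so the health endpoint works unchanged
curl http://localhost:10013/health
# OK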
Configure MCP (Full / Docker)
claude mcp add memex -- npx -y mcp-remote http://localhost:10013/api/mcp
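Claude Code can list its registered MCP servers, which is a quick way to confirm the command above took effect before you start searching:
claude mcp list   # the memex entry should appear here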
Then search in your AI CLI:
use memex search "anything you want"
Documentation
- Installation - Docker, build from source, troubleshooting
- Configuration - Environment variables and options
- API Reference - REST API endpoints
- MCP Tools - Claude Code integration
- How it Works - Data flow, architecture, and internals
- Advanced - Claude Code Hooks and more