Persistent Memory for AI Agents
SuperMemory gives your AI semantic memory that persists across sessions. Store knowledge, recall context, and build on past conversations.
How It Works
Modern retrieval technology for fast, accurate memory recall.
Vector Embeddings
Memories are embedded as vectors using OpenAI or Gemini models. Search finds memories by meaning, not exact keywords.
Cross-Encoder Reranking
Two-stage retrieval: fast vector similarity first, then cross-encoder reranking for precision. Returns the most relevant memories.
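The two-stage pipeline can be sketched in a few lines (toy vectors and a stand-in cross-encoder score for illustration, not SuperMemory's actual implementation):

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def two_stage_retrieve(query_vec, memories, cross_score, k_fast=20, k_final=5):
    # Stage 1: cheap vector similarity narrows the whole store to k_fast candidates
    candidates = sorted(memories, key=lambda m: cosine(query_vec, m["vec"]),
                        reverse=True)[:k_fast]
    # Stage 2: the slower, more accurate cross-encoder reranks only those candidates
    return sorted(candidates, key=cross_score, reverse=True)[:k_final]
```

The point of the split is cost: vector similarity is one dot product per memory, while a cross-encoder runs a full model pass per query–memory pair, so it only ever sees the short list.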
Flexible Storage
Run locally with SQLite for full privacy, or deploy to the cloud with Firestore. Same MCP interface either way.
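As a rough illustration of the local mode, memories can live in a single SQLite file with embeddings serialized alongside the text (a hypothetical schema for illustration only; SuperMemory's real tables may differ):

```python
import json
import sqlite3

# Hypothetical schema for illustration; not SuperMemory's actual tables.
conn = sqlite3.connect(":memory:")  # use a file path for on-disk persistence
conn.execute("""CREATE TABLE memories (
    id INTEGER PRIMARY KEY,
    content TEXT NOT NULL,
    tags TEXT,            -- JSON array of tag strings
    embedding TEXT        -- JSON array of floats from the embedding model
)""")
conn.execute(
    "INSERT INTO memories (content, tags, embedding) VALUES (?, ?, ?)",
    ("Team prefers pnpm for JS projects",
     json.dumps(["tooling"]),
     json.dumps([0.1, 0.9])),
)
rows = conn.execute(
    "SELECT content FROM memories WHERE tags LIKE ?", ('%"tooling"%',)
).fetchall()
```

Everything stays in one local file, which is what makes the air-gapped mode possible: nothing but the embedding API call (if any) leaves the machine.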
6 Ways AI Memory Helps You
Real examples of what SuperMemory does for your AI assistant.
Remember Your Preferences
Your AI learns your coding style, tool preferences, and workflow habits once — and remembers them forever.
Build on Past Conversations
Pick up exactly where you left off. Your AI recalls decisions, discussions, and context from previous sessions.
Store Learned Skills
Teach your AI a procedure once — deploy scripts, debugging workflows, build steps — and it remembers how.
Track Project Context
Architecture decisions, file conventions, API patterns — your AI keeps a living knowledge base of your project.
Cross-Session Debugging
Remember past bugs, solutions, and workarounds. Your AI never solves the same problem twice from scratch.
Personal Knowledge Base
Store research, meeting notes, reference material — anything you want your AI to know without re-explaining.
Ready to try it?
Add persistent memory to your AI in under a minute.
Your Data, Protected
Privacy-first architecture with EU compliance built in.
All cloud infrastructure is hosted in Frankfurt, Germany. Your memories never leave the EU. You retain full control of your data—delete any memory or everything at any time. Or run locally with SQLite for complete air-gapped privacy.
Setup Guide
How SuperMemory Works
SuperMemory is an MCP server that gives any AI assistant persistent semantic memory. It runs alongside your AI client and provides three tools:
- store_memory — Save any text with optional tags. The server generates vector embeddings automatically.
- retrieve_memory — Search by meaning (semantic search) or by ID. Filter by tag. Results are reranked for precision.
- delete_memory — Remove outdated memories to keep your knowledge base clean.
Your AI learns to use these tools naturally. Say "remember this for next time" and it stores a memory. Ask about something from weeks ago and it retrieves the relevant context.
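Under the hood, an MCP client invokes these tools with JSON-RPC requests. A `store_memory` call might look like this (the `content` and `tags` argument names are illustrative; check the server's tool schema for the exact fields):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "store_memory",
    "arguments": {
      "content": "Team prefers pnpm over npm for all JS projects.",
      "tags": ["preferences", "tooling"]
    }
  }
}
```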
Local vs. Cloud
Local (stdio): Run with npx — memories stored in SQLite on your machine. Fully private, no cloud needed. Requires an OpenAI or Gemini API key for embeddings.
Cloud (HTTP): Connect to the hosted server — memories stored in Firestore. Available across all sessions and devices.
Claude Desktop
Works with any Claude Desktop installation.
- Open Claude Desktop → Settings → Developer → Edit Config
- Add this to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "supermemory": {
      "command": "npx",
      "args": ["@nicepkg/supermemory"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

- Replace `sk-...` with your OpenAI API key (for embeddings)
- Restart Claude Desktop
Use Gemini embeddings instead

Replace the env block with:

```json
"env": {
  "EMBEDDING_PROVIDER": "gemini",
  "GEMINI_API_KEY": "your-key"
}
```

Claude (Web)
Requires Claude Pro or Team subscription.
- Go to claude.ai → Settings → Integrations
- Click Add Integration
- Enter this URL: `https://mcp.supermemory.13afoundry.com/mcp`

The hosted server uses Firestore for cloud-persistent storage. Your memories are available across all sessions.
ChatGPT
Requires ChatGPT Pro, Business, Enterprise, or Edu plan. Web only.
- Enable Developer Mode:
- Go to Settings → Apps → Advanced Settings
- Turn on Developer Mode
- Click the button below to create an app
- Fill in these details:
- Name: SuperMemory
- MCP Server URL: `https://mcp.supermemory.13afoundry.com/mcp`
- Authentication: None
- Click "I understand" on the safety warning
Troubleshooting
Can't find Developer Mode?
Only workspace admins can enable it. Ask your admin to enable it under Workspace Settings → Permissions & Roles → Connected Data → Developer mode.
Safety warning?
It's normal to see "OpenAI hasn't reviewed this MCP server". Click "I understand" to continue.
Cursor
Works with any Cursor plan.
- Open your MCP config file:
  - Mac: `~/.cursor/mcp.json`
  - Windows: `%USERPROFILE%\.cursor\mcp.json`
- Add this configuration (or create the file if it doesn't exist):
```json
{
  "mcpServers": {
    "supermemory": {
      "command": "npx",
      "args": ["@nicepkg/supermemory"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

- Replace `sk-...` with your OpenAI API key
- Restart Cursor
Troubleshooting
MCP not showing up?
- Make sure the config file is valid JSON (no trailing commas)
- Fully restart Cursor (quit and reopen)
- Check that `npx` is available (Node.js installed)
Need Node.js?
Download from nodejs.org (LTS version recommended)
Gemini CLI
Setup Instructions
- Install Gemini CLI
- Edit `~/.gemini/settings.json`
- Add this configuration:
```json
{
  "mcpServers": {
    "supermemory": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.supermemory.13afoundry.com/mcp"]
    }
  }
}
```

- Run `/mcp` in Gemini CLI to connect
Other Apps
Generic Setup
SuperMemory works with any MCP-compatible app.
For local use (stdio):
```json
{
  "mcpServers": {
    "supermemory": {
      "command": "npx",
      "args": ["@nicepkg/supermemory"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

For cloud use (HTTP):

- Server URL: `https://mcp.supermemory.13afoundry.com/mcp`
- Transport: Streamable HTTP