# Personal Brain MCP

Your AI's long-term memory: a production-ready MCP server that gives Claude Desktop persistent memory across sessions through semantic search, knowledge graphs, and multimodal document processing.
Claude Desktop forgets everything when you close it. Personal Brain fixes that.
Upload documents, save conversations, and build a knowledge graph that Claude can search semantically. Your AI assistant finally remembers what you told it last week.
```
You: "What did we discuss about the React architecture last month?"
Claude: *searches your personal brain* "On Dec 15, you decided on..."
```
# Install in 30 seconds

```bash
pip install personal-brain-mcp
```

Add to Claude Desktop config:

```json
{
  "mcpServers": {
    "personal-brain": {
      "command": "personal-brain-mcp",
      "args": []
    }
  }
}
```

Restart Claude Desktop. Done. Your AI now has a brain.
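If you already have other MCP servers configured, add the entry alongside them rather than replacing the file. A minimal sketch of merging the entry into an existing config dict (on macOS the config typically lives at `~/Library/Application Support/Claude/claude_desktop_config.json`):

```python
import json

def add_personal_brain(config: dict) -> dict:
    """Merge the personal-brain entry into an existing Claude Desktop config."""
    servers = config.setdefault("mcpServers", {})
    servers["personal-brain"] = {"command": "personal-brain-mcp", "args": []}
    return config

# An existing config with another server keeps its entries.
existing = {"mcpServers": {"other-server": {"command": "other", "args": []}}}
merged = add_personal_brain(existing)
print(sorted(merged["mcpServers"]))  # ['other-server', 'personal-brain']
```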
```
┌─────────────────────────────────────────────────────────────────┐
│                         CLAUDE DESKTOP                          │
│                                │                                │
│                  Model Context Protocol (MCP)                   │
└─────────────────────────────────────────────────────────────────┘
                                 │
                                 ▼
┌─────────────────────────────────────────────────────────────────┐
│                    PERSONAL BRAIN MCP SERVER                    │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────────────────┐  │
│  │  14 Tools   │  │ 5 Resources │  │     FastAPI Server      │  │
│  └─────────────┘  └─────────────┘  └─────────────────────────┘  │
│         │                │                      │               │
│         ▼                ▼                      ▼               │
│  ┌──────────────────────────────────────────────────────────┐   │
│  │                      SERVICE LAYER                       │   │
│  │  ┌────────────┐  ┌────────────┐  ┌────────────────────┐  │   │
│  │  │ RAG Engine │  │ Chat Mgmt  │  │   Hybrid Search    │  │   │
│  │  └────────────┘  └────────────┘  └────────────────────┘  │   │
│  └──────────────────────────────────────────────────────────┘   │
│         │                │                      │               │
│         ▼                ▼                      ▼               │
│  ┌──────────────────────────────────────────────────────────┐   │
│  │                        DATA LAYER                        │   │
│  │  ┌────────────┐  ┌────────────┐  ┌────────────────────┐  │   │
│  │  │  Pinecone  │  │  NetworkX  │  │    File Parsers    │  │   │
│  │  │  Vectors   │  │   Graph    │  │   PDF/OCR/Audio    │  │   │
│  │  └────────────┘  └────────────┘  └────────────────────┘  │   │
│  └──────────────────────────────────────────────────────────┘   │
└─────────────────────────────────────────────────────────────────┘
```
| Format | Processing |
|---|---|
| PDF | PyPDF2 text extraction |
| Images | Tesseract OCR |
| Audio | SpeechRecognition transcription |
| Text | Direct chunking |
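The table above maps each input type to a processing backend. A minimal sketch of how such routing might work by file extension (the mapping and function are illustrative, not the package's actual API):

```python
from pathlib import Path

# Illustrative extension-to-backend mapping; the server's real routing
# logic may differ.
PROCESSORS = {
    ".pdf": "PyPDF2 text extraction",
    ".png": "Tesseract OCR",
    ".jpg": "Tesseract OCR",
    ".wav": "SpeechRecognition transcription",
    ".mp3": "SpeechRecognition transcription",
    ".txt": "Direct chunking",
    ".md": "Direct chunking",
}

def route(path: str) -> str:
    """Pick a processing backend for a file, falling back to plain text."""
    return PROCESSORS.get(Path(path).suffix.lower(), "Direct chunking")

print(route("paper.PDF"))    # PyPDF2 text extraction
print(route("scan.png"))     # Tesseract OCR
print(route("meeting.wav"))  # SpeechRecognition transcription
```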
Not just vector search — we build entity relationships across your documents.
```
# Example: Find connections between concepts
claude> "How does our auth system relate to the API rate limiting?"

# Hybrid search: vector similarity + graph traversal
# Returns: connected documents with relationship context
```

| Category | Tools |
|---|---|
| Documents | `search_documents`, `get_document_details`, `list_all_documents`, `ask_with_citations` |
| Chat | `search_chat_history`, `save_chat`, `retrieve_saved_chats`, `list_saved_chats` |
| Import | `import_chat_export`, `process_chat_command` |
| Graph | `search_documents_hybrid`, `explore_entity_relationships`, `get_graph_statistics` |
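The graph tools combine two signals: vector similarity and entity connections. A stdlib-only sketch of the idea, with toy embeddings and a dict adjacency map standing in for Pinecone and NetworkX (the scoring and boost values are illustrative):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Toy corpus: name -> (embedding, entities). Real embeddings come from
# Google Generative AI; the entity graph lives in NetworkX.
docs = {
    "auth-design.md": ([0.9, 0.1], {"auth"}),
    "rate-limits.md": ([0.2, 0.8], {"rate-limiting"}),
    "style-guide.md": ([0.6, 0.5], {"css"}),
}
graph = {"auth": {"rate-limiting"}, "rate-limiting": {"auth"}}  # adjacency map

def hybrid_search(query_vec, query_entities, boost=0.5):
    """Rank docs by cosine similarity, boosting graph-connected entities."""
    related = set(query_entities)
    for entity in query_entities:  # one hop of graph traversal
        related |= graph.get(entity, set())
    scored = sorted(
        ((cosine(query_vec, vec) + (boost if ents & related else 0.0), name)
         for name, (vec, ents) in docs.items()),
        reverse=True,
    )
    return [name for _, name in scored]

print(hybrid_search([0.9, 0.2], {"auth"}))
# ['auth-design.md', 'rate-limits.md', 'style-guide.md']
```

With the graph boost, `rate-limits.md` outranks the more vector-similar `style-guide.md` because its entity is one hop from the query's.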
```
# Save important conversations
Claude: "Should I save this architecture discussion?"
You: "Yes, tag it as 'backend-design'"

# Retrieve months later
You: "What did we decide about the database schema?"
```
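Under the hood, saved chats carry tags that drive later retrieval. An illustrative data model (not the server's actual schema):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SavedChat:
    """Illustrative record for an archived conversation."""
    title: str
    content: str
    tags: set = field(default_factory=set)
    saved_on: date = field(default_factory=date.today)

class ChatArchive:
    def __init__(self):
        self._chats = []

    def save(self, chat: SavedChat) -> None:
        self._chats.append(chat)

    def by_tag(self, tag: str) -> list:
        """Return all archived chats carrying the given tag."""
        return [c for c in self._chats if tag in c.tags]

archive = ChatArchive()
archive.save(SavedChat("Architecture discussion", "...", {"backend-design"}))
archive.save(SavedChat("Sprint planning", "...", {"process"}))
print([c.title for c in archive.by_tag("backend-design")])
# ['Architecture discussion']
```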
```
# Upload papers, notes, transcripts
# Ask questions with citations
You: "What does the literature say about transformer attention?"
Claude: "Based on your uploaded papers... [1][2][3]"
```
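`ask_with_citations` ties each answer fragment back to a numbered source. A sketch of how such markers can be assigned to retrieved fragments (the function and output format are illustrative, not the tool's actual behavior):

```python
def cite(answer_parts, sources):
    """Join answer fragments with [n] markers and build a reference list.

    answer_parts: list of (text, source_id) pairs from retrieval;
    sources: source_id -> human-readable title.
    """
    numbering, fragments, refs = {}, [], []
    for text, src in answer_parts:
        if src not in numbering:
            numbering[src] = len(numbering) + 1  # first use assigns the number
            refs.append(f"[{numbering[src]}] {sources[src]}")
        fragments.append(f"{text} [{numbering[src]}]")
    return " ".join(fragments), refs

answer, refs = cite(
    [("Self-attention relates every position to every other.", "paper-1"),
     ("Multi-head attention runs several projections in parallel.", "paper-2")],
    {"paper-1": "Uploaded paper A", "paper-2": "Uploaded paper B"},
)
print(refs)  # ['[1] Uploaded paper A', '[2] Uploaded paper B']
```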
```
# Migrate from ChatGPT or other Claude sessions
POST /import/chat
# Supports: JSON exports, TXT, HTML
```

Requirements:

- Python 3.8+
- Tesseract OCR (for image processing): `brew install tesseract`
- API Keys: Google AI, Pinecone (free tiers available)
```bash
pip install personal-brain-mcp
```

Or install from source:

```bash
git clone https://github.com/anudeepadi/personal-brain-mcp.git
cd personal-brain-mcp
pip install -e .
```

Configure your API keys:

```
# .env file
GOOGLE_API_KEY=your_google_api_key
PINECONE_API_KEY=your_pinecone_api_key
PINECONE_INDEX_NAME=personal-brain
ANTHROPIC_API_KEY=optional_for_claude_model
```

The server also exposes a REST API for programmatic access:
| Endpoint | Method | Description |
|---|---|---|
| `/upsert` | POST | Upload and process files |
| `/search` | GET | Semantic search |
| `/chat` | POST | RAG-based conversation |
| `/archive/chat` | POST | Save chat sessions |
| `/import/chat` | POST | Import external chats |
| `/graph/*` | GET/POST | Knowledge graph operations |
Full OpenAPI docs at `http://localhost:8000/docs`
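Any HTTP client works against the endpoints above. A stdlib-only sketch of building a `/search` request (the `query` and `top_k` parameter names are assumptions; verify the actual schema against the OpenAPI docs):

```python
import json
import urllib.parse
import urllib.request

BASE = "http://localhost:8000"

def build_search_url(query: str, top_k: int = 5) -> str:
    """Build the /search URL; parameter names are assumed, not confirmed."""
    params = urllib.parse.urlencode({"query": query, "top_k": top_k})
    return f"{BASE}/search?{params}"

def search(query: str) -> dict:
    """Call the semantic search endpoint (requires the server to be running)."""
    with urllib.request.urlopen(build_search_url(query)) as resp:
        return json.load(resp)

print(build_search_url("transformer attention"))
# http://localhost:8000/search?query=transformer+attention&top_k=5
```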
- Backend: FastAPI, Python 3.8+
- Vector DB: Pinecone with Google Generative AI embeddings
- LLM: Google Gemini + Anthropic Claude (via LangChain)
- Graph: NetworkX for entity relationships
- MCP: FastMCP for Claude Desktop integration
- Processing: PyPDF2, Tesseract, SpeechRecognition
| Feature | Personal Brain | Obsidian + Plugin | Notion AI | Mem.ai |
|---|---|---|---|---|
| MCP Integration | ✅ Native | ❌ | ❌ | ❌ |
| Self-Hosted | ✅ | ✅ | ❌ | ❌ |
| Knowledge Graph | ✅ | Partial | ❌ | ❌ |
| Multimodal | ✅ PDF/OCR/Audio | ❌ | Partial | Partial |
| Open Source | ✅ MIT | Varies | ❌ | ❌ |
| Cost | Free (self-host) | Free | $10+/mo | $10+/mo |
- Web UI for document management
- Scheduled document sync (Google Drive, Notion)
- Multi-user support with auth
- Local embeddings option (no API required)
- Plugin system for custom processors
Contributions welcome!
```bash
# Development setup
pip install -e ".[dev]"
pytest tests/
ruff check app/
```

MIT License - see LICENSE