Cognee Logo

cognee‑mcp - Run cognee’s memory engine as a Model Context Protocol server

Demo · Learn more · Join Discord · Join r/AIMemory

Build memory for Agents and query from any client that speaks MCP – in your terminal or IDE.

✨ Features

  • Multiple transports – choose Streamable HTTP --transport http (recommended for web deployments), SSE --transport sse (real‑time streaming), or stdio (classic pipe, default)
  • Cloud Mode – connect to Cognee Cloud via --serve-url or COGNEE_SERVICE_URL env var (see Connection Modes)
  • API Mode – connect to an already running Cognee FastAPI server (see Connection Modes)
  • Minimal Memory API – exposes only remember, recall, and forget for agent memory workflows
  • Integrated logging – all actions written to a rotating file (see get_log_file_location()) and mirrored to console in dev
  • Session-aware memory – store fast session cache entries or permanent graph memory through one remember tool
  • Focused recall – query memory through one recall tool with optional session and search controls
  • Simple deletion – remove a dataset or all owned memory through one forget tool

Please refer to our documentation here for further information.

🚀 Quick Start

  1. Clone cognee repo
    git clone https://github.com/topoteretes/cognee.git
    
  2. Navigate to cognee-mcp subdirectory
    cd cognee/cognee-mcp
    
  3. Install uv if you don't have it
    pip install uv
    
  4. Install the dependencies for the cognee MCP server with uv
    uv sync --dev --all-extras --reinstall
    
  5. Activate the virtual environment in the cognee-mcp directory
    source .venv/bin/activate
    
  6. Set your OpenAI API key in .env for a quick setup with the default cognee configuration
    LLM_API_KEY="YOUR_OPENAI_API_KEY"
    
  7. Run the cognee MCP server with stdio (default)
    python src/server.py
    
    or stream responses over SSE
    python src/server.py --transport sse
    
    or run with Streamable HTTP transport (recommended for web deployments)
    python src/server.py --transport http --host 127.0.0.1 --port 8000 --path /mcp
    

For more advanced configuration, create a .env file from our template. To use different LLM providers or database configurations, and for more information, check out our documentation.

🐳 Docker Usage

If you'd rather run cognee-mcp in a container, you have two options:

  1. Build locally

    1. Make sure you are in the /cognee root directory and have a fresh .env containing your LLM_API_KEY (and any other settings you need).

    2. Remove any old image and rebuild:

      docker rmi cognee/cognee-mcp:main || true
      docker build --no-cache -f cognee-mcp/Dockerfile -t cognee/cognee-mcp:main .
    3. Run it:

      # For HTTP transport (recommended for web deployments)
      docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
      # For SSE transport
      docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
      # For stdio transport (default)
      docker run -e TRANSPORT_MODE=stdio --env-file ./.env --rm -it cognee/cognee-mcp:main

      Installing optional dependencies at runtime:

      You can install optional dependencies when running the container by setting the EXTRAS environment variable:

      # Install a single optional dependency group at runtime
      docker run \
        -e TRANSPORT_MODE=http \
        -e EXTRAS=aws \
        --env-file ./.env \
        -p 8000:8000 \
        --rm -it cognee/cognee-mcp:main
      
      # Install multiple optional dependency groups at runtime (comma-separated)
      docker run \
        -e TRANSPORT_MODE=sse \
        -e EXTRAS=aws,postgres,neo4j \
        --env-file ./.env \
        -p 8000:8000 \
        --rm -it cognee/cognee-mcp:main

      Available optional dependency groups:

      • aws - S3 storage support
      • postgres / postgres-binary - PostgreSQL database support
      • neo4j - Neo4j graph database support
      • neptune - AWS Neptune support
      • chromadb - ChromaDB vector store support
      • scraping - Web scraping capabilities
      • distributed - Modal distributed execution
      • langchain - LangChain integration
      • llama-index - LlamaIndex integration
      • anthropic - Anthropic models
      • groq - Groq models
      • mistral - Mistral models
      • ollama / huggingface - Local model support
      • docs - Document processing
      • codegraph - Code analysis
      • monitoring - Sentry & Langfuse monitoring
      • redis - Redis support
      • And more (see pyproject.toml for full list)
  2. Pull from Docker Hub (no build required):

    # With HTTP transport (recommended for web deployments)
    docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
    # With SSE transport
    docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
    # With stdio transport (default)
    docker run -e TRANSPORT_MODE=stdio --env-file ./.env --rm -it cognee/cognee-mcp:main

    With runtime installation of optional dependencies:

    # Install optional dependencies from Docker Hub image
    docker run \
      -e TRANSPORT_MODE=http \
      -e EXTRAS=aws,postgres \
      --env-file ./.env \
      -p 8000:8000 \
      --rm -it cognee/cognee-mcp:main
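For reference, splitting a comma-separated EXTRAS value into dependency groups and turning it into an install command can be sketched as follows. This is a hypothetical illustration of the behavior described above; the function names and the exact install invocation are assumptions, not the actual entrypoint script:

```python
def parse_extras(extras_env: str) -> list[str]:
    """Split a comma-separated EXTRAS value into dependency group names."""
    return [group.strip() for group in extras_env.split(",") if group.strip()]

def extras_install_args(extras_env: str) -> list[str]:
    """Build a uv sync invocation that adds each requested optional group."""
    args = ["uv", "sync"]
    for group in parse_extras(extras_env):
        args += ["--extra", group]
    return args
```

For example, `EXTRAS=aws,postgres,neo4j` would yield `uv sync --extra aws --extra postgres --extra neo4j`.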

Important: Docker vs Direct Usage

Docker uses environment variables, not command line arguments:

  • ✅ Docker: -e TRANSPORT_MODE=http
  • ❌ Docker: --transport http (won't work)

Direct Python usage uses command line arguments:

  • ✅ Direct: python src/server.py --transport http
  • ❌ Direct: -e TRANSPORT_MODE=http (won't work)
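Conceptually, the container entrypoint translates the TRANSPORT_MODE environment variable into the same --transport flag that direct usage passes explicitly. A minimal sketch of that mapping (the function name and validation are assumptions, not the real entrypoint code):

```python
def transport_args(env: dict[str, str]) -> list[str]:
    """Map Docker's TRANSPORT_MODE variable to the CLI flag that
    direct Python usage passes explicitly."""
    mode = env.get("TRANSPORT_MODE", "stdio")  # stdio is the documented default
    if mode not in ("stdio", "sse", "http"):
        raise ValueError(f"unknown transport mode: {mode}")
    return ["--transport", mode]
```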

Docker API Mode

To connect the MCP Docker container to a Cognee API server running on your host machine:

Simple Usage (Automatic localhost handling):

# Start your Cognee API server on the host
python -m cognee.api.client

# Run MCP container in API mode - localhost is automatically converted!
docker run \
  -e TRANSPORT_MODE=sse \
  -e API_URL=http://localhost:8000 \
  -e API_TOKEN=your_auth_token \
  -p 8001:8000 \
  --rm -it cognee/cognee-mcp:main

Note: The container will automatically convert localhost to host.docker.internal on Mac/Windows/Docker Desktop. You'll see a message in the logs showing the conversion.
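The conversion the note describes amounts to rewriting the URL's host. A minimal sketch of such a rewrite (illustrative only; the container's actual logic may differ):

```python
from urllib.parse import urlsplit, urlunsplit

def rewrite_localhost(api_url: str) -> str:
    """Replace a localhost/127.0.0.1 host with host.docker.internal so a
    container can reach a server running on the Docker Desktop host."""
    parts = urlsplit(api_url)
    if parts.hostname in ("localhost", "127.0.0.1"):
        port = f":{parts.port}" if parts.port else ""
        parts = parts._replace(netloc=f"host.docker.internal{port}")
    return urlunsplit(parts)
```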

Explicit host.docker.internal (Mac/Windows):

# Or explicitly use host.docker.internal
docker run \
  -e TRANSPORT_MODE=sse \
  -e API_URL=http://host.docker.internal:8000 \
  -e API_TOKEN=your_auth_token \
  -p 8001:8000 \
  --rm -it cognee/cognee-mcp:main

On Linux (use host network or container IP):

# Option 1: Use host network (simplest)
docker run \
  --network host \
  -e TRANSPORT_MODE=sse \
  -e API_URL=http://localhost:8000 \
  -e API_TOKEN=your_auth_token \
  --rm -it cognee/cognee-mcp:main

# Option 2: Use host IP address
# First, get your host IP: ip addr show docker0
docker run \
  -e TRANSPORT_MODE=sse \
  -e API_URL=http://172.17.0.1:8000 \
  -e API_TOKEN=your_auth_token \
  -p 8001:8000 \
  --rm -it cognee/cognee-mcp:main

Environment variables for API mode:

  • API_URL: URL of the running Cognee API server
  • API_TOKEN: Authentication token (optional, required if API has authentication enabled)

Note: When running in API mode:

  • Database migrations are automatically skipped (API server handles its own DB)
  • Some features are limited (see API Mode Limitations)

🔗 MCP Client Configuration

After starting your Cognee MCP server with Docker, you need to configure your MCP client to connect to it.

SSE Transport Configuration (Recommended)

Start the server with SSE transport:

docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main

Configure your MCP client:

Claude CLI (Easiest)

claude mcp add cognee-sse -t sse http://localhost:8000/sse

Verify the connection:

claude mcp list

You should see your server connected:

Checking MCP server health...

cognee-sse: http://localhost:8000/sse (SSE) - ✓ Connected

Manual Configuration

Claude (~/.claude.json)

{
  "mcpServers": {
    "cognee": {
      "type": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}

Cursor (~/.cursor/mcp.json)

{
  "mcpServers": {
    "cognee-sse": {
      "url": "http://localhost:8000/sse"
    }
  }
}

HTTP Transport Configuration (Alternative)

Start the server with HTTP transport:

docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main

Configure your MCP client:

Claude CLI (Easiest)

claude mcp add cognee-http -t http http://localhost:8000/mcp

Verify the connection:

claude mcp list

You should see your server connected:

Checking MCP server health...

cognee-http: http://localhost:8000/mcp (HTTP) - ✓ Connected

Manual Configuration

Claude (~/.claude.json)

{
  "mcpServers": {
    "cognee": {
      "type": "http",
      "url": "http://localhost:8000/mcp"
    }
  }
}

Cursor (~/.cursor/mcp.json)

{
  "mcpServers": {
    "cognee-http": {
      "url": "http://localhost:8000/mcp"
    }
  }
}

Dual Configuration Example

You can configure both transports simultaneously for testing:

{
  "mcpServers": {
    "cognee-sse": {
      "type": "sse",
      "url": "http://localhost:8000/sse"
    },
    "cognee-http": {
      "type": "http",
      "url": "http://localhost:8000/mcp"
    }
  }
}

Note: Only enable the server you're actually running to avoid connection errors.

🌐 Connection Modes

The MCP server supports three connection modes:

Direct Mode (Default)

The MCP server directly imports and uses the cognee library with local databases (SQLite, LanceDB, Ladybug). This is the default mode with full feature support.

Cloud Mode

Connect to Cognee Cloud or a remote Cognee instance. The server calls cognee.serve() at startup, and all SDK operations transparently route to the cloud. No local databases needed.

Via CLI flags:

python src/server.py --serve-url https://your-instance.cognee.ai --serve-api-key ck_...

Via environment variables (zero-config):

export COGNEE_SERVICE_URL="https://your-instance.cognee.ai"
export COGNEE_API_KEY="ck_..."
python src/server.py

Cloud Mode with Docker:

docker run \
  -e TRANSPORT_MODE=sse \
  -e COGNEE_SERVICE_URL=https://your-instance.cognee.ai \
  -e COGNEE_API_KEY=ck_... \
  -p 8000:8000 \
  --rm -it cognee/cognee-mcp:main

Cloud Mode arguments / environment variables:

  • --serve-url / COGNEE_SERVICE_URL: Cognee Cloud or remote instance URL
  • --serve-api-key / COGNEE_API_KEY: API key for the instance

Database migrations are automatically skipped in Cloud mode.
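Since the serve URL and API key can come from either CLI flags or environment variables, resolution along these lines applies (a sketch assuming flags take precedence over the environment, which is the usual convention; not the actual server code):

```python
def resolve_cloud_config(serve_url, serve_api_key, env):
    """Resolve the Cloud Mode URL and API key, preferring CLI flag values
    over the COGNEE_SERVICE_URL / COGNEE_API_KEY environment variables."""
    url = serve_url or env.get("COGNEE_SERVICE_URL")
    key = serve_api_key or env.get("COGNEE_API_KEY")
    return url, key
```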

API Mode

The MCP server connects to an already running Cognee FastAPI server via HTTP requests. This is useful when:

  • You have a centralized Cognee API server running
  • You want to separate the MCP server from the knowledge graph backend
  • You need multiple MCP servers to share the same knowledge graph

Starting the MCP server in API mode:

# Start your Cognee FastAPI server first (default port 8000)
cd /path/to/cognee
python -m cognee.api.client

# Then start the MCP server in API mode
cd cognee-mcp
python src/server.py --api-url http://localhost:8000 --api-token YOUR_AUTH_TOKEN

API Mode with different transports:

# With SSE transport
python src/server.py --transport sse --api-url http://localhost:8000 --api-token YOUR_TOKEN

# With HTTP transport
python src/server.py --transport http --api-url http://localhost:8000 --api-token YOUR_TOKEN

API Mode with Docker:

# On Mac/Windows (use host.docker.internal to access host)
docker run \
  -e TRANSPORT_MODE=sse \
  -e API_URL=http://host.docker.internal:8000 \
  -e API_TOKEN=YOUR_TOKEN \
  -p 8001:8000 \
  --rm -it cognee/cognee-mcp:main

# On Linux (use host network)
docker run \
  --network host \
  -e TRANSPORT_MODE=sse \
  -e API_URL=http://localhost:8000 \
  -e API_TOKEN=YOUR_TOKEN \
  --rm -it cognee/cognee-mcp:main

Command-line arguments for API mode:

  • --api-url: Base URL of the running Cognee FastAPI server (e.g., http://localhost:8000)
  • --api-token: Authentication token for the API (optional, required if API has authentication enabled)

Docker environment variables for API mode:

  • API_URL: Base URL of the running Cognee FastAPI server
  • API_TOKEN: Authentication token (optional, required if API has authentication enabled)

API Mode behavior: The MCP server intentionally exposes only the memory API: remember, recall, and forget. In API mode these tools call the Cognee API server endpoints directly. Operational helpers such as cognify, search, list_data, delete, prune, improve, and document retrieval helpers are kept internal and are not exposed as MCP tools.

💻 Basic Usage

The MCP server exposes its functionality through tools. Call them from any MCP client (Cursor, Claude Desktop, Cline, Roo and more).

Available Tools

The MCP server exposes three tools:

  • remember: Store data in memory. With session_id: fast session cache. Without session_id: permanent graph memory
  • recall: Search memory with auto-routing. Searches session cache first when session_id is provided, then falls through to the permanent graph
  • forget: Delete memory by dataset name, or delete all owned memory with everything=True

Examples:

# Store permanent memory
remember(data="Cognee MCP now exposes a focused memory API.", dataset_name="main_dataset")

# Store session memory
remember(data="Temporary working note", session_id="agent-session-1")

# Recall from memory
recall(query="What changed in the MCP server?", session_id="agent-session-1")

# Delete one dataset
forget(dataset="main_dataset")

Development and Debugging

Debugging

To use the debugger, run: mcp dev src/server.py

Open the inspector with a timeout passed as a query parameter: http://localhost:5173?timeout=120000

To apply new changes while developing cognee:

  1. Update dependencies in the cognee folder if needed
  2. uv sync --dev --all-extras --reinstall
  3. mcp dev src/server.py

Development

To use a local cognee checkout:

  1. Uncomment the following line in the cognee-mcp pyproject.toml file and set the cognee root path.

    #"cognee[postgres,codegraph,gemini,huggingface,docs,neo4j] @ file:/Users/<username>/Desktop/cognee"
    

    Remember to replace file:/Users/<username>/Desktop/cognee with your actual cognee root path.

  2. Install dependencies with uv in the mcp folder

    uv sync --reinstall
    

Code of Conduct

We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.

💫 Contributors

contributors

Star History

Star History Chart