
[Bug]: Memory sync session-delta ignores chunking, cascades to trigger compaction #6016

@batumilove

Description

Summary

Large session files cause memory sync to fail with "Input is longer than the context size", which unexpectedly cascades to trigger safeguard compaction in the main agent session, losing user messages.

Bug Chain

  1. Session file grows large (6.7MB in my case from extended debugging session)
  2. User sends a message (voice message in this case)
  3. Memory sync session-delta triggers
  4. Session-delta sync attempts to embed the session data without respecting chunking limits
  5. Embedding model fails: "Input is longer than the context size"
  6. This failure somehow triggers safeguard compaction in the main session
  7. User's message is lost in compaction
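Step 4 is the root cause: the session-delta payload should be split into token-budgeted chunks before embedding. A minimal sketch of the expected chunking, assuming a rough 4-characters-per-token heuristic (the names `approxTokens` and `chunkForEmbedding` are illustrative, not OpenClaw's actual API):

```typescript
// Rough token estimate: ~4 characters per token (common heuristic,
// not the embedding model's real tokenizer).
function approxTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Split `text` into chunks of at most `maxTokens`, with `overlapTokens`
// of overlap between consecutive chunks, mirroring the
// memorySearch.chunking config ({ tokens: 16000, overlap: 500 }).
function chunkForEmbedding(
  text: string,
  maxTokens = 16000,
  overlapTokens = 500,
): string[] {
  const maxChars = maxTokens * 4;
  const overlapChars = overlapTokens * 4;
  const step = maxChars - overlapChars;
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + maxChars));
    if (start + maxChars >= text.length) break;
  }
  return chunks;
}
```

With chunking like this, a 6.7MB session would produce many small embedding requests instead of one oversized one, and the "Input is longer than the context size" error would never occur.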

Observed Logs

[memory] memory sync failed (session-delta): Error: Input is longer than the context size. Try to increase the context size or use another model that supports longer contexts.
[memory] memory sync failed (session-delta): Error: Input is longer than the context size...
[memory] memory sync failed (watch): Error: Input is longer than the context size...
[agent/embedded] embedded run compaction start: runId=2a3690eb-2a45-4cd2-9ff5-dd23a5a285dd
[agent/embedded] embedded run compaction retry: runId=2a3690eb-2a45-4cd2-9ff5-dd23a5a285dd

This pattern repeated every time the user sent a voice message.

Expected Behavior

  1. Session-delta sync should respect memorySearch.chunking.tokens (configured as 16000)
  2. Memory sync failures should NOT cascade to trigger compaction in the main agent
  3. User messages should never be lost due to background indexing failures
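Expectations 2 and 3 amount to fault isolation: a background indexing task should log its failure and stop, never escalate into the live session. A hedged sketch of that containment (`safeMemorySync` is a hypothetical wrapper, not existing OpenClaw code):

```typescript
type SyncResult = { ok: true } | { ok: false; error: Error };

// Run a background memory-sync task so that any failure is logged and
// swallowed. The caller (the main agent loop) never sees the error,
// so an indexing failure cannot trigger compaction or drop messages.
async function safeMemorySync(
  sync: () => Promise<void>,
  log: (msg: string) => void,
): Promise<SyncResult> {
  try {
    await sync();
    return { ok: true };
  } catch (err) {
    const error = err instanceof Error ? err : new Error(String(err));
    log(`[memory] memory sync failed: ${error.message}`);
    return { ok: false, error };
  }
}
```

Under this design, the observed "Input is longer than the context size" error would end at the log line, and the compaction path would only ever run on genuine context pressure in the main session.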

Actual Behavior

  • Session-delta sends the entire session to the embedding model
  • Embedding model (voyage-4-large, 16k context) rejects it
  • Compaction is triggered, wiping context including the just-received message

Workaround

Removing "sessions" from memorySearch.sources avoids the failure:

{
  "agents": {
    "defaults": {
      "memorySearch": {
        "sources": ["memory"]  // was ["memory", "sessions"]
      }
    }
  }
}

Environment

  • OpenClaw: 2026.1.29 (dev channel)
  • OS: Linux 6.12.42 (x64)
  • Embedding: voyage-4-large via VoyageAI
  • Chunking config: { "tokens": 16000, "overlap": 500 }
  • Session file size: 6.7MB (~6500 lines JSONL)

Relevant Config

{
  "memorySearch": {
    "enabled": true,
    "sources": ["memory", "sessions"],
    "experimental": { "sessionMemory": true },
    "model": "voyage-4-large",
    "chunking": { "tokens": 16000, "overlap": 500 }
  },
  "compaction": {
    "mode": "safeguard",
    "memoryFlush": { "enabled": true }
  }
}

Impact

  • Voice messages (and likely any messages) silently lost
  • User sees "context compacted" message with no indication of the cause
  • Appears as if the system is ignoring their input

Metadata

Labels: bug (Something isn't working)