
fix: use root LLM config as fallback for graph store instead of hardcoded OpenAI default#4466

Merged

whysosaket merged 1 commit into main from fix/graph-store-llm-config-propagation on Mar 21, 2026

Conversation

@utkarsh240799
Contributor

Description

If the user provides a root-level llm config (e.g. Anthropic), the graph store ignores it and uses the hardcoded default OpenAI config instead. The user-provided LLM config only works for the graph store if it is redundantly specified under graphStore.llm.
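As a minimal sketch of the scenario (field names follow the PR description; the provider/model values are illustrative, not taken from the mem0 source), this is the kind of config that triggered the bug — a root-level `llm` with no `graphStore.llm`:

```typescript
// Illustrative user config: one root-level LLM, no graphStore.llm.
// Before this fix, the graph store ignored the root llm and silently
// used the hardcoded OpenAI default instead.
const userConfig = {
  llm: {
    provider: "anthropic",
    config: { model: "claude-3-opus", apiKey: "..." }, // placeholder key
  },
  graphStore: {
    provider: "neo4j",
    config: { url: "bolt://localhost:7687" },
    // no graphStore.llm: the root llm above should be the fallback
  },
};

// The only workaround was to redundantly repeat the LLM settings
// under graphStore.llm as well.
console.log(userConfig.llm.provider);
```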

Root cause (two problems in graph_memory.ts):

  1. Default config override — defaults.ts hardcoded graphStore.llm to openai/gpt-4-turbo-preview. Since ConfigManager.mergeConfig shallow-spreads defaults first, this always overrode the user's root llm.provider.

  2. Provider/config mismatch — The constructor resolved the LLM provider with correct precedence (graphStore.llm → root llm → "openai"), but always passed this.config.llm.config (root config) to LLMFactory.create, ignoring graphStore.llm.config entirely.
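The first problem can be reproduced with a small sketch of a shallow merge (identifiers are illustrative, not the actual mem0 source): because the defaults object already carries a populated `graphStore.llm`, any "use graphStore.llm if present" fallback never fires.

```typescript
// Hardcoded default, as described for defaults.ts:
const defaults = {
  graphStore: {
    llm: { provider: "openai", config: { model: "gpt-4-turbo-preview" } },
  },
};

// User sets only a root-level llm:
const userConfig = {
  llm: { provider: "anthropic", config: { model: "claude-3-opus" } },
};

// mergeConfig-style shallow spread: defaults first, then the user config.
// userConfig has no graphStore key, so the default graphStore.llm survives.
const merged = { ...defaults, ...userConfig };

// graphStore.llm still exists after the merge, so a precedence check like
// "graphStore.llm ?? root llm" always picks the hardcoded OpenAI default.
console.log(merged.graphStore.llm.provider);
```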

Fix:

  • Remove the hardcoded default graphStore.llm from defaults.ts so the root llm naturally becomes the fallback.
  • Make the config resolution in graph_memory.ts follow the same precedence as the provider resolution: use graphStore.llm.config when present, fall back to root llm.config.
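The fixed resolution can be sketched as follows (a minimal illustration, with assumed type and function names rather than the actual graph_memory.ts code): both the provider and the config object now follow the same precedence chain.

```typescript
interface LLMSettings {
  provider?: string;
  config?: Record<string, unknown>;
}

interface MemoryConfig {
  llm?: LLMSettings;
  graphStore?: { llm?: LLMSettings };
}

// Illustrative helper: graphStore.llm wins when present, then the root
// llm, then the "openai" default. Previously only the provider followed
// this chain; the config object was always taken from the root llm.
function resolveGraphLLM(cfg: MemoryConfig): { provider: string; config: Record<string, unknown> } {
  const graphLLM = cfg.graphStore?.llm;
  return {
    provider: graphLLM?.provider ?? cfg.llm?.provider ?? "openai",
    config: graphLLM?.config ?? cfg.llm?.config ?? {},
  };
}
```

With this shape, a root-level Anthropic config propagates to the graph store automatically, while an explicit graphStore.llm still overrides it.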

Fixes #3425

Type of change

  • Bug fix (non-breaking change which fixes an issue)

How Has This Been Tested?

Added 7 new tests across two test files:

graph-memory-parsing.test.ts — 4 unit tests for MemoryGraph constructor (mocked LLMFactory):

  • Root llm config propagates when no graphStore.llm is set
  • graphStore.llm overrides root llm (both provider and config)
  • Defensive fallback to root config when graphStore.llm.config is undefined
  • Defaults to "openai" when no provider is set anywhere

config-manager.test.ts — 3 end-to-end tests through ConfigManager.mergeConfig:

  • No default graphStore.llm exists after merge (root llm is the fallback)
  • Explicit graphStore.llm is preserved when user provides it
  • No graphStore.llm when user doesn't provide one

Full test suite results (511 tests across 33 suites — all pass):

  • graph-memory-parsing.test.ts — 31 passed
  • config-manager.test.ts — 26 passed
  • graph-prompts.test.ts — 21 passed
  • memory.init.test.ts — 5 passed
  • memory.add.test.ts — 10 passed
  • memory.crud.test.ts — 22 passed
  • 15 other unit test files — 301 passed
  • 7 client test files — 95 passed
  • Build (with prettier check) passes

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective
  • New and existing unit tests pass locally with my changes
  • I have checked my code and corrected any misspellings

Maintainer Checklist

🤖 Generated with Claude Code

…oded OpenAI default (#3425)

The graph store ignored the user's root-level LLM config and always used
a hardcoded OpenAI default. This fixes both the default override and a
provider/config mismatch where the provider could come from graphStore.llm
but the config always came from the root LLM.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@whysosaket whysosaket merged commit 06c25eb into main Mar 21, 2026
11 of 13 checks passed
@kartik-mem0 kartik-mem0 deleted the fix/graph-store-llm-config-propagation branch March 21, 2026 14:09
jamebobob pushed a commit to jamebobob/mem0-vigil-recall that referenced this pull request Mar 29, 2026
…oded OpenAI default (mem0ai#4466)

Co-authored-by: utkarsh240799 <utkarsh240799@users.noreply.github.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>


Development

Successfully merging this pull request may close these issues.

Nodejs SDK: Default llm config is used for graph store instead of user's root llm config
