Summary
GoogleLLM.generateResponse in mem0ai/dist/oss/index.js does not pass the tools parameter to Gemini's generateContent API. Graph memory operations that require tool calls (extract_entities, establish_relationships, delete_graph_memory) always return plain text — zero entities are extracted and Neo4j is never populated. No error is thrown; the failure is completely silent.
Environment
- mem0ai version: 2.4.0 (JS/TS OSS package)
- Node.js: 22.x
- LLM provider: google (Gemini Flash)
- Graph store: Neo4j (bolt, self-hosted)
- Vector store: Qdrant
Steps to Reproduce
Configure mem0 OSS with Google as the LLM provider and Neo4j as the graph store:
import { Memory } from 'mem0ai/oss';

const memory = new Memory({
  enableGraph: true,
  llm: {
    provider: 'google',
    config: {
      model: 'gemini-2.0-flash',
      apiKey: process.env.GOOGLE_API_KEY,
    },
  },
  graphStore: {
    provider: 'neo4j',
    config: {
      url: 'bolt://localhost:7687',
      username: 'neo4j',
      password: 'password',
    },
  },
  embedder: {
    provider: 'ollama',
    config: { model: 'nomic-embed-text:latest' },
  },
});
await memory.add('I live in Baltimore and work in commercial real estate.', { userId: 'user1' });
Check Neo4j — zero nodes created despite successful vector store writes.
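One way to verify (a sketch, assuming the neo4j-driver package is installed and a Neo4j instance is reachable with the credentials from the config above):

```javascript
// Count nodes after memory.add() to confirm nothing was written.
// Requires a live Neo4j instance; credentials match the repro config above.
import neo4j from 'neo4j-driver';

const driver = neo4j.driver('bolt://localhost:7687', neo4j.auth.basic('neo4j', 'password'));
const session = driver.session();
try {
  const result = await session.run('MATCH (n) RETURN count(n) AS nodes');
  // With the bug present this reports 0 even though vector writes succeeded.
  console.log('node count:', result.records[0].get('nodes').toNumber());
} finally {
  await session.close();
  await driver.close();
}
```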
Root Cause
In dist/oss/index.js, GoogleLLM.generateResponse receives a tools parameter but never passes it to the Gemini API:
// CURRENT (buggy):
async generateResponse(messages, responseFormat, tools) {
  const completion = await this.google.models.generateContent({
    contents: messages.map((msg) => ({ ... })),
    model: this.model,
    // tools is silently ignored — never forwarded
  });
  const text = completion.text?.replace(/^```json\n/, '').replace(/\n```$/, '');
  return text || '';
}
When MemoryGraph calls structuredLlm.generateResponse(messages, format, tools) for entity/relationship extraction, Gemini returns plain text instead of a function-call response. The caller then checks response.toolCalls, which is undefined, so entities = [] and nothing is written to Neo4j.
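The silent-failure path can be sketched in isolation with a stub standing in for the buggy GoogleLLM (hypothetical names, not the actual MemoryGraph source):

```javascript
// Stub mimicking the buggy GoogleLLM: tools are accepted but ignored,
// so the model's answer comes back as a plain string.
const buggyLlm = {
  async generateResponse(messages, responseFormat, tools) {
    return 'The user lives in Baltimore.'; // plain text, no tool calls
  },
};

// Mirrors the caller's pattern: read toolCalls off the response and
// fall back to an empty entity list when it is missing.
async function extractEntities(llm) {
  const tools = [{ function: { name: 'extract_entities' } }];
  const response = await llm.generateResponse([], undefined, tools);
  const toolCalls = response.toolCalls; // undefined on a plain string
  return toolCalls ? toolCalls.map((tc) => JSON.parse(tc.arguments)) : [];
}

const entities = await extractEntities(buggyLlm);
console.log(entities); // [] (no error is raised, so nothing reaches Neo4j)
```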
Proposed Fix
Pass functionDeclarations to Gemini and parse functionCall parts from the response:
async generateResponse(messages, responseFormat, tools) {
  const systemMsg = messages.find((m) => m.role === 'system');
  const nonSystemMsgs = messages.filter((m) => m.role !== 'system');
  const requestParams = {
    contents: nonSystemMsgs.map((msg) => ({
      parts: [{ text: typeof msg.content === 'string' ? msg.content : JSON.stringify(msg.content) }],
      role: msg.role === 'assistant' ? 'model' : 'user',
    })),
    model: this.model,
    config: {},
  };
  if (systemMsg) {
    requestParams.config.systemInstruction = typeof systemMsg.content === 'string'
      ? systemMsg.content
      : JSON.stringify(systemMsg.content);
  }
  if (tools && tools.length > 0) {
    requestParams.config.tools = [{
      functionDeclarations: tools.map((t) => ({
        name: t.function.name,
        description: t.function.description || '',
        parameters: t.function.parameters || {},
      })),
    }];
  }
  const completion = await this.google.models.generateContent(requestParams);
  const parts = completion.candidates?.[0]?.content?.parts;
  if (parts) {
    const fnCalls = parts.filter((p) => p.functionCall);
    if (fnCalls.length > 0) {
      return {
        content: '',
        role: 'assistant',
        toolCalls: fnCalls.map((p) => ({
          name: p.functionCall.name,
          arguments: typeof p.functionCall.args === 'string'
            ? p.functionCall.args
            : JSON.stringify(p.functionCall.args),
        })),
      };
    }
  }
  const text = completion.text;
  return text?.replace(/^```json\n/, '').replace(/\n```$/, '') || '';
}
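The parsing half of the fix can be exercised without network access by feeding a stubbed completion object shaped like Gemini's candidates/content/parts response (a sketch with made-up data, not a real API payload):

````javascript
// Exercise the functionCall-parsing branch of the fix against a stubbed
// Gemini completion object; falls back to stripped plain text otherwise.
function parseCompletion(completion) {
  const parts = completion.candidates?.[0]?.content?.parts;
  if (parts) {
    const fnCalls = parts.filter((p) => p.functionCall);
    if (fnCalls.length > 0) {
      return {
        content: '',
        role: 'assistant',
        toolCalls: fnCalls.map((p) => ({
          name: p.functionCall.name,
          arguments: typeof p.functionCall.args === 'string'
            ? p.functionCall.args
            : JSON.stringify(p.functionCall.args),
        })),
      };
    }
  }
  return completion.text?.replace(/^```json\n/, '').replace(/\n```$/, '') || '';
}

// Fabricated example payload for illustration only.
const stub = {
  candidates: [{
    content: {
      parts: [{
        functionCall: {
          name: 'extract_entities',
          args: { entities: [{ entity: 'baltimore', entity_type: 'city' }] },
        },
      }],
    },
  }],
};

const parsed = parseCompletion(stub);
console.log(parsed.toolCalls[0].name); // extract_entities
````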
Related Issues
A related report covers graph tool-call failures for other providers (structuredLlm hardcoded to openai_structured). That issue affects Groq/Anthropic via a hardcoded provider string; this issue affects Google specifically because the tools parameter is never forwarded to the Gemini API regardless of provider routing.
Impact
Anyone using Google/Gemini as their LLM provider with graph memory enabled gets silently broken graph extraction. Vector store writes succeed, giving the false impression that everything is working. No error is surfaced.