🐛 Describe the bug
When instantiating Memory locally with Anthropic as the LLM, the default configuration fails with the exception below:
```
anthropic.BadRequestError: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': '`temperature` and `top_p` cannot both be specified for this model. Please use only one.'},
```
It appears the Anthropic API rejects requests that set both `temperature` and `top_p`, yet mem0's defaults send both.
Code to reproduce the bug:

```python
from mem0 import Memory

config = {
    "llm": {
        "provider": "anthropic",
        "config": {
            "model": "claude-haiku-4-5-20251001"
        }
    },
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text:latest"
        }
    },
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "path": "/tmp/qdrant"
        }
    }
}

m = Memory.from_config(config)
user_id = "local-dev-user-123"
m.add("My name is John. I am a software developer.", user_id=user_id)
```
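Until the default is fixed upstream, a possible workaround is to override the LLM config so that only one sampling parameter is specified. This is a sketch, not a confirmed fix: the `temperature` key is assumed from mem0's generic LLM config options and should be verified against the installed version.

```python
# Hypothetical workaround: explicitly set only `temperature` and leave
# `top_p` unset, so both sampling parameters are not sent to the Anthropic
# API together. The "temperature" key is an assumption based on mem0's
# generic LLM config; verify it against your installed mem0 version.
llm_config = {
    "provider": "anthropic",
    "config": {
        "model": "claude-haiku-4-5-20251001",
        "temperature": 0.1,  # pick one sampling knob; do not also set "top_p"
    },
}

# Sanity check: the overridden config itself carries no "top_p" entry.
assert "top_p" not in llm_config["config"]
```

This dict would replace the `"llm"` entry in the reproduction config above; whether mem0 still injects a default `top_p` internally is exactly what this bug report is about.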