
OpenClaw integration doesn’t support lmstudio #4235

@jmaver

Description


🐛 Describe the bug

I can use a local Qdrant server in the OpenClaw config, but not lmstudio.
This works:

"openclaw-mem0": {
        "enabled": true,
        "config": {
          "mode": "open-source",
          "oss": {
            "vectorStore": {
              "provider": "qdrant",
              "config": {
                "host": "192.168.200.12",
                "port": 6333,
                "checkCompatibility": false
              }
            },
            "llm": {
              "provider": "openai",
              "config": {
                "model": "gpt-5-nano",
"apiKey": "XXX"
              }
            },
            "embedder": {
              "provider": "openai",
              "config": {
                "model": "text-embedding-3-small",
"apiKey": "XXX"
              }
            }
          }
        }
      }

But swapping embedder to:

            "embedder": {
              "provider": "lmstudio",
              "config": {
                "model": "text-embedding-gte-qwen2-1.5b-instruct",
                "embedding_dims": 1536,
                "lmstudio_base_url": "http://192.168.200.83:1234/v1"
              }
            }

fails with:
2026-03-06T13:03:14.605Z [gateway] openclaw-mem0: recall failed: Error: Unsupported embedder provider: lmstudio
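
As a possible workaround until the gateway recognizes the lmstudio provider: LM Studio exposes an OpenAI-compatible API under its /v1 endpoint, so pointing the already-supported "openai" embedder at the LM Studio server might work. This is a sketch only — it assumes OpenClaw's openai embedder config accepts a base-URL override (the `baseUrl` field name below is a guess; check the OpenClaw docs for the actual key):

```json
{
  "embedder": {
    "provider": "openai",
    "config": {
      "model": "text-embedding-gte-qwen2-1.5b-instruct",
      "baseUrl": "http://192.168.200.83:1234/v1",
      "apiKey": "lm-studio"
    }
  }
}
```

LM Studio typically ignores the API key, but an OpenAI client usually requires a non-empty value, hence the placeholder.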

Locally, I can use the mem0 Python library with lmstudio without problems:

from mem0 import Memory

config = {
    "embedder": {
        "provider": "lmstudio",
        "config": {
            "model": "text-embedding-gte-qwen2-1.5b-instruct",
            "embedding_dims": 1536,
            "lmstudio_base_url": "http://192.168.200.83:1234/v1"
        }
    },
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "url": "http://192.168.200.12:6333",
            "collection_name": "memories_local",
            "embedding_model_dims": 1536
        }
    },
    "llm": {
        "provider": "lmstudio",
        "config": {
            "model": "openai/gpt-oss-20b",
            "max_tokens": 2000,
            "lmstudio_base_url": "http://192.168.200.83:1234/v1",
            "lmstudio_response_format": {
                "type": "json_schema",
                "json_schema": {
                    "name": "response",
                    "schema": {"type": "object"}
                }
            }
        }
    }
}

memory = Memory.from_config(config)

# Store a memory
result = memory.add("I love hiking and my favourite food is sushi.", user_id="test_user")
print("Added:", result)

# Retrieve it
results = memory.search("What food does the user like?", user_id="test_user")
for r in results["results"]:
    print("Found:", r["memory"])

Metadata

Labels: P2-medium (Annoying but has workarounds)