any-llm version: latest (commit 2acea65)
Python Version: 3.12
Describe the bug
The Mistral provider fails with a TypeError when the user parameter is included in completion requests. This parameter is part of OpenAI's standard API (defined in CompletionParams) but is not supported by Mistral's API.
Error message:
TypeError: Chat.stream_async() got an unexpected keyword argument 'user'
Stack trace:
  File "src/any_llm/providers/mistral/mistral.py", line 116, in _stream_completion_async
    mistral_stream = await self.client.chat.stream_async(model=model, messages=messages, **kwargs)
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Chat.stream_async() got an unexpected keyword argument 'user'
Root cause:
MistralProvider._convert_completion_params() does not exclude the user parameter when converting CompletionParams to Mistral API format.
Current code (src/any_llm/providers/mistral/mistral.py, line 60-62):
converted_params = params.model_dump(
    exclude_none=True, exclude={"model_id", "messages", "response_format", "stream"}
)
The user parameter is defined in CompletionParams (line 129 in src/any_llm/types/completion.py) as part of the OpenAI-compatible interface, but Mistral's API doesn't support it.
Example w/ Code
Reproduction (Python):
from any_llm import completion
response = completion(
    model="mistral:mistral-small-latest",
    messages=[{"role": "user", "content": "Hello"}],
    user="user_123",  # This parameter causes the TypeError
    stream=True,
)
Reproduction (HTTP/Gateway):
curl -X POST http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "mistral:mistral-small-latest",
    "messages": [{"role": "user", "content": "Hello"}],
    "user": "test_user_123",
    "stream": true
  }'
Expected Behavior:
The request should succeed, with the unsupported user parameter silently dropped before the Mistral SDK is called.
Actual Behavior:
Raises TypeError: Chat.stream_async() got an unexpected keyword argument 'user'
Proposed Fix
Add "user" to the exclude list in src/any_llm/providers/mistral/mistral.py:61:
converted_params = params.model_dump(
    exclude_none=True, exclude={"model_id", "messages", "response_format", "stream", "user"}
)
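A regression check for the fix could look like the following sketch. The convert helper is a hypothetical stand-in that emulates model_dump with a dict filter (it is not any-llm's API), so the check runs without the library installed:

```python
# Hypothetical regression-test sketch: after the fix, "user" must not
# survive parameter conversion for the Mistral provider.
def convert(params: dict, exclude: set) -> dict:
    # Stand-in for params.model_dump(exclude_none=True, exclude=exclude)
    return {k: v for k, v in params.items() if v is not None and k not in exclude}

# Exclude set from the proposed fix, with "user" added
FIXED_EXCLUDE = {"model_id", "messages", "response_format", "stream", "user"}

params = {
    "model_id": "mistral-small-latest",
    "messages": [{"role": "user", "content": "Hello"}],
    "user": "user_123",
    "stream": True,
    "temperature": None,  # exclude_none drops this one
}

converted = convert(params, FIXED_EXCLUDE)
assert "user" not in converted  # would have leaked (and raised) before the fix
print(converted)  # {} -- nothing unsupported reaches stream_async(**kwargs)
```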
Additional Context
Impact:
- Affects any application using any-llm directly OR through the gateway
- The user parameter is commonly used for tracking and rate limiting in production systems
- Gateway users are also affected since the gateway uses the core library internally
- Both streaming and non-streaming requests are affected
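Until the fix lands, affected callers can work around the bug by stripping the parameter themselves. The helper below is a hypothetical sketch (the function name and the provider prefix convention "mistral:" are taken from the repro above; the unsupported-parameter table is an assumption, not part of any-llm):

```python
# Workaround sketch: drop params a provider's SDK rejects before calling
# any_llm.completion(). Only "user" for Mistral is known from this report.
def strip_unsupported_params(model: str, kwargs: dict) -> dict:
    unsupported = {"mistral": {"user"}}  # assumed table, extend as needed
    provider = model.split(":", 1)[0]    # "mistral:mistral-small-latest" -> "mistral"
    return {k: v for k, v in kwargs.items()
            if k not in unsupported.get(provider, set())}

call_kwargs = strip_unsupported_params(
    "mistral:mistral-small-latest",
    {"messages": [{"role": "user", "content": "Hello"}],
     "user": "user_123", "stream": True},
)
# call_kwargs no longer contains "user"; pass it to any_llm.completion(...)
```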
Related Code:
- CompletionParams definition: src/any_llm/types/completion.py:129-130
- Similar handling for reasoning_effort: src/any_llm/providers/mistral/mistral.py:71-72
Contribution
I have submitted a PR to fix this issue
Checklist:
- I am on the latest main branch (commit 2acea65)