Status: Closed
Labels: bug (Something isn't working)
Description
Check Existing Issues
- I have searched the existing issues and discussions.
- I am using the latest version of Open WebUI.
Installation Method
Docker
Open WebUI Version
v0.6.16
Ollama Version (if applicable)
No response
Operating System
MacOS Sequoia
Browser (if applicable)
No response
Confirmation
- I have read and followed all instructions in README.md.
- I am using the latest version of both Open WebUI and Ollama.
- I have included the browser console logs.
- I have included the Docker container logs.
- I have provided every relevant configuration, setting, and environment variable used in my setup.
- I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
- I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
- Start with the initial platform/version/OS and dependencies used,
- Specify exact install/launch/configure commands,
- List URLs visited, user input (incl. example values/emails/passwords if needed),
- Describe all options and toggles enabled or changed,
- Include any files or environmental changes,
- Identify the expected and actual result at each stage,
- Ensure any reasonably skilled user can follow and hit the same issue.
Expected Behavior
I should be able to see debug logs even when streaming responses.
Actual Behavior
When I stream responses (which is the default behavior), it throws errors and doesn't process the full response.
Steps to Reproduce
- Run the latest image with:
docker run -d --name openwebui -p 3000:8080 -e GLOBAL_LOG_LEVEL=debug -v openwebui-data:/app/backend/data --restart unless-stopped ghcr.io/open-webui/open-webui:latest
- Configure any model, e.g. Cerebras qwen-3-235b-a22b or OpenAI gpt-4
- Run a prompt like 'print a bunch of stuff'
- Notice it does not complete the output
- Notice the errors in the logs below, primarily TypeError: not all arguments converted during string formatting
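The initial JSONDecodeError can be reproduced in isolation. A minimal sketch, using the truncated chunk value captured in the logs below — a streamed SSE chunk cut mid-string is not valid JSON:

```python
import json

# A streamed chunk truncated mid-string, as captured in the logs below
chunk = '{"id":"chatcmpl-Buldq8WBame0yuxskIc6k70qRn6NK","object":"chat.completion.chunk'

try:
    json.loads(chunk)
except json.JSONDecodeError as e:
    print(e)  # Unterminated string starting at: line 1 column 57 (char 56)
```

This first exception is expected during streaming (partial reads happen); the real bug is how the except-handler then logs it.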
Logs & Screenshots
2025-07-18 20:06:45.722 | ERROR | open_webui.tasks:cleanup_task:88 - Task exception was never retrieved
future: <Task finished name='Task-2730' coro=<process_chat_response.<locals>.response_handler() done, defined at /app/backend/open_webui/utils/middleware.py:1407> exception=TypeError('not all arguments converted during string formatting')> - {}
Traceback (most recent call last):
File "/app/backend/open_webui/utils/middleware.py", line 1800, in stream_body_handler
data = json.loads(data)
│ │ └ '{"id":"chatcmpl-Buldq8WBame0yuxskIc6k70qRn6NK","object":"chat.completion.chunk'
│ └ <function loads at 0x7f91cceb5a80>
└ <module 'json' from '/usr/local/lib/python3.11/json/__init__.py'>
File "/usr/local/lib/python3.11/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
│ │ └ '{"id":"chatcmpl-Buldq8WBame0yuxskIc6k70qRn6NK","object":"chat.completion.chunk'
│ └ <function JSONDecoder.decode at 0x7f91cceb53a0>
└ <json.decoder.JSONDecoder object at 0x7f91cd63b810>
File "/usr/local/lib/python3.11/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
│ │ │ │ └ '{"id":"chatcmpl-Buldq8WBame0yuxskIc6k70qRn6NK","object":"chat.completion.chunk'
│ │ │ └ <built-in method match of re.Pattern object at 0x7f91cce56810>
│ │ └ '{"id":"chatcmpl-Buldq8WBame0yuxskIc6k70qRn6NK","object":"chat.completion.chunk'
│ └ <function JSONDecoder.raw_decode at 0x7f91cceb5440>
└ <json.decoder.JSONDecoder object at 0x7f91cd63b810>
File "/usr/local/lib/python3.11/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
│ │ │ └ 0
│ │ └ '{"id":"chatcmpl-Buldq8WBame0yuxskIc6k70qRn6NK","object":"chat.completion.chunk'
│ └ <_json.Scanner object at 0x7f91ccead4e0>
└ <json.decoder.JSONDecoder object at 0x7f91cd63b810>
json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 57 (char 56)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
> File "/app/backend/open_webui/utils/middleware.py", line 2069, in response_handler
await stream_body_handler(response, form_data)
│ │ └ {'stream': True, 'model': 'gpt-4', 'messages': [{'role': 'system', 'content': 'Be who you are designed to be, but dont be my ...
│ └ <starlette.responses.StreamingResponse object at 0x7f917049b0d0>
└ <function process_chat_response.<locals>.response_handler.<locals>.stream_body_handler at 0x7f916bd1c0e0>
File "/app/backend/open_webui/utils/middleware.py", line 2042, in stream_body_handler
log.debug("Error: ", e)
│ └ <function Logger.debug at 0x7f91cd83e2a0>
└ <Logger open_webui.utils.middleware (DEBUG)>
File "/usr/local/lib/python3.11/logging/__init__.py", line 1477, in debug
self._log(DEBUG, msg, args, **kwargs)
│ │ │ │ │ └ {}
│ │ │ │ └ (JSONDecodeError('Unterminated string starting at: line 1 column 57 (char 56)'),)
│ │ │ └ 'Error: '
│ │ └ 10
│ └ <function Logger._log at 0x7f91cd83e980>
└ <Logger open_webui.utils.middleware (DEBUG)>
File "/usr/local/lib/python3.11/logging/__init__.py", line 1634, in _log
self.handle(record)
│ │ └ <LogRecord: open_webui.utils.middleware, 10, /app/backend/open_webui/utils/middleware.py, 2042, "Error: ">
│ └ <function Logger.handle at 0x7f91cd83ea20>
└ <Logger open_webui.utils.middleware (DEBUG)>
File "/usr/local/lib/python3.11/logging/__init__.py", line 1644, in handle
self.callHandlers(record)
│ │ └ <LogRecord: open_webui.utils.middleware, 10, /app/backend/open_webui/utils/middleware.py, 2042, "Error: ">
│ └ <function Logger.callHandlers at 0x7f91cd83eca0>
└ <Logger open_webui.utils.middleware (DEBUG)>
File "/usr/local/lib/python3.11/logging/__init__.py", line 1706, in callHandlers
hdlr.handle(record)
│ │ └ <LogRecord: open_webui.utils.middleware, 10, /app/backend/open_webui/utils/middleware.py, 2042, "Error: ">
│ └ <function Handler.handle at 0x7f91cd83cb80>
└ <InterceptHandler (NOTSET)>
File "/usr/local/lib/python3.11/logging/__init__.py", line 978, in handle
self.emit(record)
│ │ └ <LogRecord: open_webui.utils.middleware, 10, /app/backend/open_webui/utils/middleware.py, 2042, "Error: ">
│ └ <function InterceptHandler.emit at 0x7f91abcf7740>
└ <InterceptHandler (NOTSET)>
File "/app/backend/open_webui/utils/logger.py", line 64, in emit
level, record.getMessage()
│ │ └ <function LogRecord.getMessage at 0x7f91cd83ae80>
│ └ <LogRecord: open_webui.utils.middleware, 10, /app/backend/open_webui/utils/middleware.py, 2042, "Error: ">
└ 'DEBUG'
File "/usr/local/lib/python3.11/logging/__init__.py", line 377, in getMessage
msg = msg % self.args
│ │ └ (JSONDecodeError('Unterminated string starting at: line 1 column 57 (char 56)'),)
│ └ <LogRecord: open_webui.utils.middleware, 10, /app/backend/open_webui/utils/middleware.py, 2042, "Error: ">
└ 'Error: '
TypeError: not all arguments converted during string formatting
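The secondary TypeError can be reproduced outside the app. Stdlib logging renders a record with msg % args, so passing the exception as a positional argument to a format string with no %s placeholder raises at getMessage(). A minimal sketch of what log.debug("Error: ", e) produces internally:

```python
import logging

# log.debug("Error: ", e) stores e in record.args; getMessage() then
# evaluates "Error: " % args, which fails because the format string has
# no %s placeholder for the extra argument.
record = logging.LogRecord(
    name="open_webui.utils.middleware",
    level=logging.DEBUG,
    pathname="middleware.py",
    lineno=2042,
    msg="Error: ",
    args=(ValueError("boom"),),
    exc_info=None,
)

try:
    record.getMessage()
except TypeError as e:
    print(e)  # not all arguments converted during string formatting
```

Plain stdlib handlers would swallow this via Handler.handleError, but the loguru InterceptHandler in the traceback calls record.getMessage() directly, so the TypeError propagates and kills the streaming task.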
Additional Information
I noticed that this line in middleware.py differs from the other debug log lines and needs to be adjusted to use f-string formatting:
open-webui/backend/open_webui/utils/middleware.py, line 2042 (commit 2470da8):

log.debug("Error: ", e)

Should be:

log.debug(f"Error: {e}")
Note that this bug is masking other bugs.
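An f-string fixes the crash; the stdlib's lazy %-style is an equally valid alternative that defers formatting until the record is actually emitted. A sketch of both (not the project's chosen fix):

```python
import logging

log = logging.getLogger("open_webui.utils.middleware")
e = ValueError("boom")

log.debug(f"Error: {e}")    # eager f-string: the fix proposed above
log.debug("Error: %s", e)   # lazy %-style: formatted only if the record is emitted
```

Either form renders "Error: boom" instead of crashing inside getMessage().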