Description
Check Existing Issues
- I have searched for any existing and/or related issues.
- I have searched for any existing and/or related discussions.
- I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
- I am using the latest version of Open WebUI.
Installation Method
Docker
Open WebUI Version
v0.7.2
Ollama Version (if applicable)
No response
Operating System
Fedora 43
Browser (if applicable)
No response
Confirmation
- I have read and followed all instructions in README.md.
- I am using the latest version of both Open WebUI and Ollama.
- I have included the browser console logs.
- I have included the Docker container logs.
- I have provided every relevant configuration, setting, and environment variable used in my setup.
- I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
- I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
- Start with the initial platform/version/OS and dependencies used,
- Specify exact install/launch/configure commands,
- List URLs visited, user input (incl. example values/emails/passwords if needed),
- Describe all options and toggles enabled or changed,
- Include any files or environmental changes,
- Identify the expected and actual result at each stage,
- Ensure any reasonably skilled user can follow and hit the same issue.
Expected Behavior
No ANSI color codes inside the code sent to Jupyter.
Actual Behavior
Using a prompt to generate some code and execute it sometimes works, like here:
# Calculate and print the first 10 Fibonacci numbers
def fib(n):
    a, b = 0, 1
    out = []
    for _ in range(n):
        out.append(a)
        a, b = b, a + b
    return out

print("First 10 Fibonacci numbers:")
print(fib(10))
STDOUT/STDERR
First 10 Fibonacci numbers:
[0, 1, 1, 2, 3, 5, 8, 13, 21, 34]

But sometimes the code doesn't work, because ANSI escape sequences were submitted along with it:
def fib(n):
...
print(...)
STDOUT/STDERR
\x1b[36mCell\x1b[39m\x1b[36m \x1b[39m\x1b[32mIn[1]\x1b[39m\x1b[32m, line 3\x1b[39m
\x1b[31m    \x1b[39m\x1b[31mprint(...)\x1b[39m
    ^
\x1b[31mIndentationError\x1b[39m\x1b[31m:\x1b[39m expected an indented block after function definition on line 1

In particular, every time I ask for something more complex, like running the example at:
https://docs.openwebui.com/tutorials/integrations/jupyter#create-a-visualization
it ends up adding ANSI color codes to the code, causing execution to fail.
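As a workaround sketch (this is not an Open WebUI API; `ANSI_RE` and `strip_ansi` are names I made up for illustration), the color codes could be removed with a regex before the code reaches the kernel:

```python
import re

# CSI color sequences look like "\x1b[36m" (set color) or "\x1b[39m" (reset).
ANSI_RE = re.compile(r"\x1b\[[0-9;]*m")

def strip_ansi(code: str) -> str:
    """Remove ANSI color sequences so the code compiles cleanly."""
    return ANSI_RE.sub("", code)

colored = "\x1b[36mdef\x1b[39m fib(n):\n    return n"
print(strip_ansi(colored))
```

Running this prints the code with the escape sequences removed, which compiles fine.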
Steps to Reproduce
- Install 3 separate containers in Docker:
  - ollama - built from the https://github.com/ollama/ollama.git Dockerfile
  - open-webui - from ghcr.io/open-webui/open-webui:main
  - jupyter - from quay.io/jupyter/scipy-notebook
- Set up Open WebUI to use the Jupyter integration for code execution.
- Use a prompt that will make the model use the code executor.
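The failure mode can also be reproduced outside Open WebUI: Python source containing raw ANSI escape bytes does not compile, which matches the kernel error shown above. A minimal illustration (not tied to any Open WebUI internals):

```python
# Simulate generated code that arrives with embedded ANSI color codes.
code_with_ansi = "\x1b[36mdef\x1b[39m fib(n):\n    return n"

try:
    compile(code_with_ansi, "<cell>", "exec")
except SyntaxError as exc:
    # The Jupyter kernel reports a similar SyntaxError/IndentationError.
    print(type(exc).__name__)
```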
Logs & Screenshots
Logs at Ollama, Jupyter, and Open WebUI are OK; nothing weird is reported there.
The Jupyter log shows that the kernel was properly created and the code executed:
[I 2026-02-01 15:07:30.856 ServerApp] Kernel started: 39860cb7-88e0-4215-ab1c-3a21bb72df7c
[W 2026-02-01 15:07:30.858 ServerApp] No session ID specified
[I 2026-02-01 15:07:31.156 ServerApp] Adapting from protocol version 5.3 (kernel 39860cb7-88e0-4215-ab1c-3a21bb72df7c) to 5.4 (client).
[I 2026-02-01 15:07:31.156 ServerApp] Connecting to kernel 39860cb7-88e0-4215-ab1c-3a21bb72df7c.
[I 2026-02-01 15:07:31.224 ServerApp] Starting buffering for 39860cb7-88e0-4215-ab1c-3a21bb72df7c:6e2de725-c467663054dfc536ed1ca424
[I 2026-02-01 15:07:31.225 ServerApp] Kernel shutdown: 39860cb7-88e0-4215-ab1c-3a21bb72df7c
Open WebUI logs:
2026-02-01 15:07:30.966 | INFO | uvicorn.protocols.http.httptools_impl:send:483 - 127.0.0.1:0 - "GET /api/v1/chats/05e01787-c066-4cb7-9e0a-a3abd572beec HTTP/1.1" 200
2026-02-01 15:07:31.156 | INFO | open_webui.utils.code_interpreter:execute_in_jupyter:172 - sending JSON message: ('{"header": {"msg_id": "04df4727140441c8bd216d3401a54f43", "msg_type": '
'"execute_request", "username": "user", "session": '
'"d7cf310dd9e546caa56c9f59acf78148", "date": "2026-02-01T15:07:31.156792", '
'"version": "5.3"}, "parent_header": {}, "metadata": {}, "content": {"code": '
'"def print_hello():\\n print(\\"Hello, World!\\")\\nprint_hello()", '
'"silent": false, "store_history": true, "user_expressions": {}, '
'"allow_stdin": false, "stop_on_error": true}, "channel": "shell"}')
2026-02-01 15:07:31.157 | INFO | open_webui.utils.code_interpreter:execute_in_jupyter:176 - waiting answer
2026-02-01 15:07:31.224 | INFO | open_webui.utils.code_interpreter:run:83 - code executed.
Ollama logs:
[GIN] 2026/02/01 - 15:07:28 | 200 | 553.150516ms | 172.17.0.3 | POST "/api/chat"
[GIN] 2026/02/01 - 15:07:29 | 200 | 99.470281ms | 172.17.0.3 | POST "/api/chat"
[GIN] 2026/02/01 - 15:07:29 | 200 | 97.065211ms | 172.17.0.3 | POST "/api/chat"
[GIN] 2026/02/01 - 15:07:30 | 200 | 89.458762ms | 172.17.0.3 | POST "/api/chat"
[GIN] 2026/02/01 - 15:07:30 | 200 | 84.068772ms | 172.17.0.3 | POST "/api/chat"
[GIN] 2026/02/01 - 15:07:31 | 200 | 87.481288ms | 172.17.0.3 | POST "/api/chat"
[GIN] 2026/02/01 - 15:07:31 | 200 | 537.2191ms | 172.17.0.3 | POST "/api/chat"
Additional Information
No response