
[Bug] Misleading service prompt: Ollama is running #20484

@whybeyoung

Description


Checklist

  • I searched related issues but found no solution.
  • The bug persists in the latest version.
  • Issues without environment info and a minimal reproducible demo are hard to resolve and may receive no feedback.
  • If this is not a bug report but a general question, please start a discussion at https://github.com/sgl-project/sglang/discussions. Otherwise, it will be closed.
  • Please use English. Otherwise, it will be closed.

Bug Description

When starting sglang in HTTP mode, accessing the default root endpoint (http://<ip>:<port>/) returns the message:

Ollama is running

This is misleading: the response suggests that an Ollama service is running rather than sglang, which can confuse users trying to verify which service is actually deployed.

Steps to Reproduce

  1. Start sglang using HTTP mode.
  2. Open a browser or use curl to access the root endpoint:
http://<ip>:<port>/
  3. Observe the returned response.

Actual Behavior

The server responds with:

Ollama is running

Expected Behavior

The endpoint should either:

  • Return an sglang-specific message (e.g., sglang server is running), or
  • Return a proper API root response / documentation endpoint.
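The expected behavior above can be sketched as follows. This is a minimal illustration using Python's standard-library HTTP server, not sglang's actual server stack (which is FastAPI-based); the handler and the `sglang server is running` message are assumptions about what a fixed root endpoint might return.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class RootHandler(BaseHTTPRequestHandler):
    # Hypothetical root handler returning an sglang-specific message,
    # instead of the misleading "Ollama is running" response.
    def do_GET(self):
        body = b"sglang server is running"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # suppress per-request logging for the demo

# Start the demo server on an ephemeral port and query its root endpoint.
server = HTTPServer(("127.0.0.1", 0), RootHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
    root_message = resp.read().decode()
server.shutdown()
print(root_message)  # sglang server is running
```

Either a plain service-identifying string like this, or a JSON payload linking to the API docs, would remove the ambiguity.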

Notes

This response likely comes from a reused or inherited handler and does not reflect the actual running service, which may lead to confusion during debugging or service verification.

CC @hnyls2002 @Kangyan-Zhou @alisonshao

Environment

Any sglang server
