
[Feature] Improve Unit Test Coverage #20865

@ispobock

Description

Motivation

SGLang's test suite has 600+ test files, but most are E2E tests that launch a full server. While these cover many code paths, core modules like managers/, mem_cache/, entrypoints/, sampling/, parser/, function_call/, and utils/ still need better unit test coverage (tests that run without launching a server).

Why this matters:

  • E2E tests launch a server and load model weights, taking minutes. Unit tests run in seconds with mocked dependencies.
  • When an E2E test fails, the root cause could be anywhere. Unit tests pinpoint the exact broken function.

What's Already Done

Directory Structure

test/registered/unit/          # mirrors python/sglang/srt/
├── mem_cache/                 (6 files)
├── function_call/             (5 files)
├── utils/                     (4 files)
├── parser/                    (3 files)
├── managers/                  (3 files)
├── entrypoints/               (2 files)
├── layers/                    (2 files)
├── observability/             (2 files)
├── model_loader/              (2 files)
├── server_args/               (1 file)
├── model_executor/            (1 file)
├── batch_invariant_ops/       (1 file)
└── ...                        (new subdirs added as tests are written)

Principle: Source file at srt/mem_cache/radix_cache.py → test at unit/mem_cache/test_radix_cache.py.
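The mirroring rule is mechanical, so it can be expressed as a tiny path helper. This is a sketch for illustration only; `mirrored_test_path` is a hypothetical name, not a utility in the repo:

```python
from pathlib import Path

def mirrored_test_path(src: str) -> str:
    """Map a source file under srt/ to its mirrored unit-test location.

    Hypothetical helper that just encodes the naming convention above:
    srt/<subdir>/<name>.py -> test/registered/unit/<subdir>/test_<name>.py
    """
    rel = Path(src).relative_to("srt")
    return str(Path("test/registered/unit") / rel.parent / f"test_{rel.stem}.py")

print(mirrored_test_path("srt/mem_cache/radix_cache.py"))
# test/registered/unit/mem_cache/test_radix_cache.py
```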

Open Tasks

Below are some suggested starting points. Comment on this issue to claim one.

You can check the latest UT coverage report run (example) or run coverage locally to find your own opportunities — modules with low coverage under core directories (managers/, sampling/, parser/, etc.) are good candidates:

pytest test/registered/unit/ --cov --cov-config=.coveragerc --cov-report=term-missing -v

Easy (good first issue)

  • srt/parser/ — conversation templates, reasoning parsers, code completion parser
  • srt/sampling/ — parameter validation, normalization, logit processor serialization
  • srt/entrypoints/openai/ — encoding/decoding, tool call formatting
  • srt/function_call/ — extend existing detector tests, add new detector coverage

Medium

  • srt/utils/ — common util helpers
  • srt/multimodal/ — media utils, processor base logic
  • srt/constrained/ — grammar dispatch logic
  • srt/managers/ — template manager, tokenizer utils, batch metadata helpers

How to Contribute

  1. Pick a task from above (or find your own via coverage), comment to claim it
  2. Look at existing examples:
    • test/registered/unit/mem_cache/test_evict_policy.py — pure logic, zero deps
    • test/registered/unit/managers/test_prefill_adder.py — mock factories for scheduler tests
    • test/registered/unit/parser/test_reasoning_parser.py — streaming parser tests
  3. Follow the conventions in test/registered/unit/README.md and the contribution guide
  4. Run locally:
    pytest test/registered/unit/ -v
    pytest test/registered/unit/ --cov --cov-config=.coveragerc -v  # with coverage
  5. Submit a PR titled [Test] Add unit tests for <module_name>, referencing this issue
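The mock-factory style referenced in step 2 (as in test_prefill_adder.py) can be sketched roughly as below. All names here are hypothetical stand-ins, not real srt classes; the point is that the factory builds only the attributes the unit under test actually reads, so no server or model weights are needed:

```python
import unittest
from unittest.mock import MagicMock

def make_fake_request(num_tokens: int) -> MagicMock:
    """Hypothetical mock factory: a request-like object with only the
    attributes the code under test reads."""
    req = MagicMock()
    req.num_tokens = num_tokens
    return req

def fits_in_budget(reqs, budget: int) -> bool:
    # Toy stand-in for real scheduler logic, so the sketch is runnable.
    return sum(r.num_tokens for r in reqs) <= budget

class TestBudgetSketch(unittest.TestCase):
    def test_within_budget(self):
        reqs = [make_fake_request(8), make_fake_request(8)]
        self.assertTrue(fits_in_budget(reqs, budget=16))

    def test_over_budget(self):
        reqs = [make_fake_request(8), make_fake_request(9)]
        self.assertFalse(fits_in_budget(reqs, budget=16))
```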

PR Quality Bar

Note: AI-assisted code is acceptable, but you are responsible for the quality — understand every line you submit. We will reject PRs that look auto-generated without understanding — e.g., tests that only check trivial cases, assert on mock return values, or don't actually exercise the real code logic. Read the source code first, then write tests that would catch real bugs.
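For a concrete (hypothetical) contrast, assume a toy normalize_weights function standing in for real srt logic. The first test below is the anti-pattern we will reject; the second pair is the bar we expect:

```python
import unittest
from unittest.mock import Mock

def normalize_weights(weights):
    """Toy stand-in for real srt logic (illustration only)."""
    total = sum(weights)
    if total == 0:
        raise ValueError("weights sum to zero")
    return [w / total for w in weights]

class BadTest(unittest.TestCase):
    def test_mock_round_trip(self):
        # Anti-pattern: asserts on a mock's own return value, so it
        # passes even if normalize_weights is completely broken.
        fake = Mock(return_value=[0.5, 0.5])
        self.assertEqual(fake([1, 1]), [0.5, 0.5])

class GoodTest(unittest.TestCase):
    def test_normalizes(self):
        # Exercises the real code path with a non-trivial input.
        self.assertEqual(normalize_weights([1, 3]), [0.25, 0.75])

    def test_zero_sum_raises(self):
        # Edge case a real bug could hide behind.
        with self.assertRaises(ValueError):
            normalize_weights([0, 0])
```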

PR Requirements

  • Test is in test/registered/unit/<module>/ (mirroring srt/)
  • Does NOT launch a server or load real model weights
  • Includes edge cases, not just happy paths
  • Uses CustomTestCase instead of unittest.TestCase
  • Registered with register_cpu_ci() or register_cuda_ci()
  • Locally tested and passing. Paste both the command and its output in your PR description:
    # command
    pytest test/registered/unit/<your_module>/test_xxx.py -v
    # output (copy-paste the full result)

Metadata

Labels

ci (continue integration related), enhancement (New feature or request), good first issue (Good for newcomers)
