[DP Attention] Optimize dp_padding_mode selection for dp_size=1 in extend mode (#20406)
Merged
ShangmingCai merged 4 commits into sgl-project:main on Mar 16, 2026
Conversation
…tend mode Signed-off-by: wangfakang <fakangwang@gmail.com>
yizhang2077 approved these changes Mar 12, 2026
Collaborator: /tag-and-rerun-ci
Contributor (Author): /rerun-failed-ci
Contributor (Author): /rerun_failed_ci
ShangmingCai
approved these changes
Mar 16, 2026
Wangzheee pushed a commit to Wangzheee/sglang that referenced this pull request on Mar 21, 2026: …tend mode (sgl-project#20406) Signed-off-by: wangfakang <fakangwang@gmail.com>
0-693 pushed a commit to 0-693/sglang that referenced this pull request on Mar 25, 2026: …tend mode (sgl-project#20406) Signed-off-by: wangfakang <fakangwang@gmail.com>
JustinTong0323 pushed a commit to JustinTong0323/sglang that referenced this pull request on Apr 7, 2026: …tend mode (sgl-project#20406) Signed-off-by: wangfakang <fakangwang@gmail.com>
yhyang201 pushed a commit to yhyang201/sglang that referenced this pull request on Apr 22, 2026: …tend mode (sgl-project#20406) Signed-off-by: wangfakang <fakangwang@gmail.com>
CC @yizhang2077 @ShangmingCai @nvcastet @ch-wan @merrymercy @Fridge003 PTAL, thx.
Motivation
When dp_size=1, the MAX_LEN and SUM_LEN modes have identical communication overhead, since max_len equals sum_len. Previously, extend mode (get_dp_padding_mode) unconditionally used SUM_LEN, which prevented symmetric memory from being used (via disabled=True). Now, with dp_size=1, we prefer MAX_LEN mode to enable the symmetric memory optimizations needed for NSA CP and other features.
sglang/python/sglang/srt/layers/dp_attention.py
Lines 64 to 68 in abc672e
sglang/python/sglang/srt/layers/dp_attention.py
Lines 119 to 125 in abc672e
sglang/python/sglang/srt/layers/dp_attention.py
Lines 129 to 135 in abc672e
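The selection logic described in the motivation can be sketched roughly as follows. This is a minimal, illustrative Python sketch: the `DpPaddingMode` enum and the `get_dp_padding_mode(dp_size, is_extend)` signature are simplified assumptions for clarity, not the actual sglang API.

```python
from enum import Enum, auto


class DpPaddingMode(Enum):
    # Pad every DP rank to the max sequence length; keeps symmetric
    # memory usable for communication.
    MAX_LEN = auto()
    # Use the sum of per-rank lengths; symmetric memory is disabled
    # (disabled=True) in this mode.
    SUM_LEN = auto()


def get_dp_padding_mode(dp_size: int, is_extend: bool) -> DpPaddingMode:
    """Sketch of the post-PR selection logic.

    With dp_size == 1, max_len equals sum_len, so the two modes cost the
    same; MAX_LEN is preferred to keep symmetric-memory optimizations
    (e.g., for NSA CP) available. SUM_LEN is chosen for extend mode only
    when dp_size > 1.
    """
    if is_extend and dp_size > 1:
        return DpPaddingMode.SUM_LEN
    return DpPaddingMode.MAX_LEN
```

With dp_size=1 this now returns MAX_LEN even in extend mode, whereas the pre-PR behavior returned SUM_LEN for extend unconditionally.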
Modifications
Update the logic of get_dp_padding_mode:
- Use SUM_LEN for extend mode only when dp_size > 1.
- Adjust the dp_size comparison accordingly (>= instead of >).
Accuracy Tests
Benchmarking and Profiling
Checklist
Review Process
/tag-run-ci-label, /rerun-failed-ci, /tag-and-rerun-ci