
[DTensor] Include Partial placements for scalar tensors in validator#174537

Closed
wconstab wants to merge 5 commits into gh/wconstab/524/base from gh/wconstab/524/head

Conversation

@wconstab
Contributor

@wconstab wconstab commented Feb 8, 2026

Stack from ghstack (oldest at bottom):

The validator excluded Partial placements for 0-dim (scalar) tensors in
both get_1d_input_placements_for_tensor and
get_1d_output_placements_for_tensor. This meant the exhaustive sweep
never tested Partial on scalars, so DTensor rules like
P(max), P(max) -> P(max) for maximum with a scalar input were always
classified as "incorrect" even though they are valid.

Partial is meaningful for scalars: P(sum) on a scalar means each rank
holds a partial value that sums to the full scalar.
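To make the semantics concrete, here is a minimal plain-Python sketch (no actual DTensor or process group involved; the per-rank values are made up for illustration). It simulates why Partial is well-defined on a scalar, and why the P(max), P(max) -> P(max) rule for maximum is correct: the max of elementwise maxes equals the max over the combined per-rank values.

```python
# Sketch of Partial placement semantics on a 0-dim (scalar) tensor,
# simulated with plain Python floats. Hypothetical per-rank local
# values for a 4-rank mesh (made up for this example).
rank_values = [1.0, 4.0, 2.0, 3.0]

# P(sum): the full scalar is the sum of the per-rank partial values.
full_sum = sum(rank_values)  # 10.0

# P(max): the full scalar is the max of the per-rank partial values.
# For maximum with two P(max) scalar inputs, taking the elementwise
# max per rank and then reducing with max gives the same result as
# reducing each input first, so P(max), P(max) -> P(max) is valid.
other_rank_values = [5.0, 0.0, 6.0, 2.0]
elementwise = [max(a, b) for a, b in zip(rank_values, other_rank_values)]
assert max(elementwise) == max(max(rank_values), max(other_rank_values))
```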

Authored with Claude.

@pytorch-bot

pytorch-bot Bot commented Feb 8, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/174537

Note: Links to docs will display an error until the docs builds have been completed.

⏳ No Failures, 36 Pending

As of commit 9987957 with merge base f365425:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@wconstab
Contributor Author

squashed

@wconstab wconstab closed this Feb 11, 2026
sandy-gags pushed a commit to sandy-gags/pytorch that referenced this pull request Mar 12, 2026
ghstack-source-id: e36351d
Pull Request resolved: pytorch/pytorch#174537
@github-actions github-actions Bot deleted the gh/wconstab/524/head branch March 14, 2026 02:24