
[DTensor] single-dim expander raises clear inplace error #173572

Closed
wconstab wants to merge 2 commits into gh/wconstab/508/base from gh/wconstab/508/head

Conversation

Contributor

@wconstab wconstab commented Jan 27, 2026

Stack from ghstack (oldest at bottom):

Inplace ops for DTensor have a restriction: you're not allowed to
redistribute the 'inplace' tensor. This means that in some cases, sharding
propagation has to fail because the inplace input is not compatible with
any of the possible sharding strategies.

This PR makes sure this case raises the expected informative error
rather than a confusing error about selecting the min cost over an
empty list of sharding strategies.
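
For illustration, a minimal sketch of the shape of code that exercises this path. The API names are the public torch.distributed.tensor ones; the failing op is a placeholder, since in practice `add_` here can succeed by redistributing `y`, and which ops lack a placement-preserving strategy depends on their registered sharding strategies:

```python
# Sketch: an in-place op on a sharded DTensor. Because `x` is mutated in
# place, DTensor may not silently redistribute it; if none of the op's
# sharding strategies keep `x` on its current placement, propagation fails.
import torch
import torch.distributed as dist
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor import Shard, distribute_tensor

# assumes torch.distributed is already initialized (e.g. via torchrun)
mesh = init_device_mesh("cuda", (dist.get_world_size(),))
x = distribute_tensor(torch.randn(8, 8), mesh, [Shard(0)])
y = distribute_tensor(torch.randn(8, 8), mesh, [Shard(1)])
x.add_(y)  # hypothetical trigger: errors only if no strategy preserves x's Shard(0)
```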


pytorch-bot Bot commented Jan 27, 2026

This PR needs a release notes: label

If your changes are user facing and intended to be a part of release notes, please use a label starting with release notes:.

If not, please add the topic: not user facing label.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "topic: not user facing"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.


pytorch-bot Bot commented Jan 27, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/173572

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 2bb4e85 with merge base 7754b55:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

f"{op_schema.op}: in-place operations that require placement changes "
f"are not supported. The input has placement {blocking_inplace_input_placements}, "
f"but no valid strategy preserves this placement. "
f"Please use the out-of-place version of this operation instead."
Contributor

@pianpwk pianpwk commented Jan 27, 2026

nit: or redistribute to {one of the possible placements}?
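
A sketch of what that suggestion might look like; the names `candidate_input_specs` and `candidate_placements` are hypothetical:

```python
# Sketch: surface the placements the op's strategies would have accepted, so
# users know what to redistribute to before retrying the in-place op.
candidate_placements = [spec.placements for spec in candidate_input_specs]  # hypothetical
raise RuntimeError(
    f"{op_schema.op}: in-place operations that require placement changes are not "
    f"supported. The input has placement {blocking_inplace_input_placements}; "
    f"use the out-of-place version of this operation, or redistribute the input "
    f"to one of {candidate_placements} first."
)
```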

@wconstab
Contributor Author

@pytorchbot merge

@pytorch-bot pytorch-bot Bot added the ciflow/trunk Trigger trunk jobs on your pull request label Jan 28, 2026
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here

pytorchmergebot pushed a commit that referenced this pull request Jan 28, 2026
kapilsh pushed a commit to kapilsh/pytorch that referenced this pull request Feb 2, 2026
@github-actions github-actions Bot deleted the gh/wconstab/508/head branch February 28, 2026 02:21
sandy-gags pushed a commit to sandy-gags/pytorch that referenced this pull request Mar 12, 2026
ghstack-source-id: 76bf5f7
Pull Request resolved: pytorch/pytorch#173572

Labels

ciflow/inductor, ciflow/trunk, Merged, release notes: distributed (dtensor)


3 participants