[DTensor] Disallow redistribution to mixed partial types #172609
wconstab wants to merge 6 commits into gh/wconstab/503/base
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/172609
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (3 Unrelated Failures) As of commit c4a0c7c with merge base 4ac0624 (FLAKY - the following jobs failed but were likely due to flakiness present on trunk:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
```python
partial_reduce_ops: set[str] = set()
for p in placements:
    if p.is_partial():
        partial_reduce_ops.add(p.reduce_op)  # type: ignore[union-attr]
```
Not sure if this makes the code simpler. We can compare Partial objects directly:
```python
partial_ops = {p for p in placements if p.is_partial()}
```
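For context, a minimal runnable sketch of the two formulations. This assumes the public placement types in `torch.distributed.tensor` (available in recent PyTorch); the `placements` tuple here is made up for illustration:

```python
from torch.distributed.tensor import Partial, Shard

# Hypothetical placements mixing two partial reduce ops on a 3-D mesh.
placements = (Shard(0), Partial("sum"), Partial("max"))

# Current approach: collect the reduce_op strings.
partial_reduce_ops = {p.reduce_op for p in placements if p.is_partial()}

# Suggested approach: collect the Partial objects themselves. Partial
# compares and hashes by its reduce_op, so the set deduplicates the same way.
partial_ops = {p for p in placements if p.is_partial()}

# Either way, more than one element means mixed partial types.
assert len(partial_reduce_ops) == len(partial_ops) == 2
```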
zpcore left a comment:
LGTM!
Curious, do we still plan to support mixed Partial in the future?
No, I can't imagine a use case, so unless someone finds a compelling one, let's simplify our lives.
ghstack-source-id: b13b323
Pull Request resolved: pytorch/pytorch#172609
```python
# We do not see a valid use case for mixing different partial types in the same DTensor.
# In principle it could be supported, but since nonlinear reductions (e.g. max) exist,
# the relative ordering of different partials would become semantically critical.
# Without a motivating use case, we prohibit this.
_assert_no_mixed_partial_types(current_spec.placements)
_assert_no_mixed_partial_types(target_spec.placements)
```
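To see why relative ordering matters for nonlinear reductions, a small standalone illustration (plain tensor math, not DTensor): reducing per-mesh-dim contributions with sum first and then max gives a different answer than max first and then sum.

```python
import torch

# Per-rank contributions on a hypothetical 2x2 mesh:
# rows = mesh dim 0, cols = mesh dim 1.
vals = torch.tensor([[1.0, 4.0],
                     [3.0, 2.0]])

sum_then_max = vals.sum(dim=0).max()          # max(1+3, 4+2) = 6
max_then_sum = vals.max(dim=0).values.sum()   # 3 + 4 = 7
assert sum_then_max.item() != max_then_sum.item()
```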
Is it true that there are several places where mixed partials could show up?
- distribute_tensor placements arg (not banned)
- DTensor.from_local placements arg (not banned)
- sharding propagation (banned here if needs_redistribute; otherwise not banned, although I can't think of a real case where it happens from any op)
- DTensor.redistribute (banned here)

From a completeness perspective, do you think it makes sense to move _assert_no_mixed_partial_types into a utils file and call it from cases 1 and 2?
Yeah, I was planning to add (1) and (2) in a later PR and update the docs accordingly, but if folks are in agreement about this I can just include it in this PR. And I agree about (3): it is not strictly banned, but it seems practically impossible for it to happen.
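As a concrete illustration of case (2), a hedged sketch of the call shape that such a follow-up check would reject. This assumes an initialized process group with at least 4 ranks; the tensor and mesh sizes here are made up:

```python
import torch
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor import DTensor, Partial

# Requires torch.distributed to be initialized with at least 4 ranks.
mesh = init_device_mesh("cpu", (2, 2))
local = torch.randn(4, 4)

# Mixed partial types across mesh dims: "sum" on dim 0, "max" on dim 1.
# Per the plan above, constructing such a DTensor would also be rejected.
dt = DTensor.from_local(local, mesh, placements=(Partial("sum"), Partial("max")))
```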
@pytorchbot merge
Merge started: Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merge failed. Reason: 2 jobs have failed, first few of them are: trunk / linux-jammy-rocm-py3.10 / test (distributed, 1, 3, linux.rocm.gpu.gfx942.4), trunk / linux-jammy-rocm-py3.10 / test (distributed, 2, 3, linux.rocm.gpu.gfx942.4). Details for Dev Infra team: Raised by workflow job.
@pytorchbot merge -i
Merge failed. Reason: 1 job has failed: Check Labels / Check labels. Details for Dev Infra team: Raised by workflow job.
@pytorchbot merge -i
Merge started: Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
…orch#172609)" This reverts commit eba22f9. Reverted pytorch#172609 on behalf of https://github.com/wconstab due to Breaking internal use case https://www.internalfb.com/intern/testinfra/testconsole/testrun/10696049259246146 ([comment](pytorch#172609 (comment)))
) Pull Request resolved: pytorch#172609 Approved by: https://github.com/zpcore, https://github.com/tianyu-l ghstack dependencies: pytorch#172924
Per #172609, we do not allow mixed partial placements in one DTensor spec. It is therefore pointless to generate strategies that encourage mixed partial types. Before this PR, any OP strategy that has a rule for more than one partial type and uses the expand util would have combined them inappropriately. After this PR, the expansion util filters out any such combinations.

Pull Request resolved: #173614
Approved by: https://github.com/pianpwk, https://github.com/zpcore
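A hypothetical sketch of the filtering behavior described above; names like `filter_mixed_partials` and `per_dim_options` are illustrative, not the actual expansion util from #173614:

```python
from itertools import product

from torch.distributed.tensor import Partial, Replicate, Shard

def filter_mixed_partials(candidates):
    """Drop any candidate placement tuple that mixes more than one partial reduce_op."""
    for placements in candidates:
        reduce_ops = {p.reduce_op for p in placements if p.is_partial()}
        if len(reduce_ops) <= 1:
            yield placements

# Per-mesh-dim options whose cross product would otherwise include a
# (Partial("sum"), Partial("max")) combination.
per_dim_options = [(Partial("sum"), Shard(0)), (Partial("max"), Replicate())]
kept = list(filter_mixed_partials(product(*per_dim_options)))

# The mixed (sum, max) tuple is filtered out; the other 3 combinations survive.
assert len(kept) == 3
```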
Stack from ghstack (oldest at bottom):