[DTensor] expand_to_full_mesh_op_strategy filters mixed partials #173614
wconstab wants to merge 6 commits into gh/wconstab/510/base
Conversation
Per #172609, we do not allow mixed partial placements in one DTensor spec. It is therefore pointless to generate strategies that encourage mixed partial types. Before this PR, any op strategy that had rules for more than one partial type and used the expand util would have combined them inappropriately. After this PR, the expansion util filters out any such combinations.
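For illustration, here is a minimal sketch of the filtering rule described above. This is not the actual `expand_to_full_mesh_op_strategy` code; it assumes the public `Partial` and `Placement` types from `torch.distributed.tensor.placement_types`, and the helper name is made up.

```python
# Hedged sketch, not the real PyTorch implementation: a candidate spec
# is "mixed" (and should be filtered out) when its Partial placements
# disagree on the reduce op across mesh dimensions.
from torch.distributed.tensor.placement_types import Partial, Placement


def has_mixed_partials(placements: tuple[Placement, ...]) -> bool:
    """True if placements mix Partial placements with different reduce ops."""
    partial_reduce_ops = {
        p.reduce_op for p in placements if isinstance(p, Partial)
    }
    return len(partial_reduce_ops) > 1


# Partial("sum") on one mesh dim plus Partial("max") on another is the
# kind of combination this PR filters out; two Partial("sum") dims are fine.
assert has_mixed_partials((Partial("sum"), Partial("max")))
assert not has_mixed_partials((Partial("sum"), Partial("sum")))
```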
```python
for spec in spec_list:
    if spec is not None:
        # collect the reduce ops of every Partial placement in this spec
        partial_reduce_ops = {
            p.reduce_op for p in spec.placements if isinstance(p, Partial)
        }
```
maybe a weird case, but I would be wary of NormPartial() and P(sum) having the same reduce_op attribute
Well, I thought we were deleting NormPartial soon? But if not, I should harden this.
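A hedged sketch of what that hardening could look like, per the concern above: key on the placement's concrete type as well as its `reduce_op`, so a `Partial` subclass like `NormPartial` that also reports `reduce_op == "sum"` is not treated as identical to a plain `Partial("sum")`. The helper names are hypothetical, not part of this PR.

```python
# Sketch only; assumes the public placement types and uses the concrete
# class in the key so distinct Partial subclasses never collapse together.
from torch.distributed.tensor.placement_types import Partial, Placement


def partial_kinds(placements: tuple[Placement, ...]) -> set[tuple[type, str]]:
    """Collect (concrete placement type, reduce_op) pairs for all Partials."""
    return {(type(p), p.reduce_op) for p in placements if isinstance(p, Partial)}


def has_mixed_partials(placements: tuple[Placement, ...]) -> bool:
    # More than one distinct (type, reduce_op) kind means a mixed spec.
    return len(partial_kinds(placements)) > 1
```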
…rtials" Per #172609, we do not allow mixed partial placements in one DTensor spec. It is therefore pointless to generate strategies that encourage mixed partial types. Before this PR, any OP strategy that has a rule for more than one partial type and uses the expand util would have combined them inappropriately. After this PR, the expansion util filters out any such combinations. [ghstack-poisoned]
Per #172609, we do not allow mixed partial placements in one DTensor spec. It is therefore pointless to generate strategies that encourage mixed partial types. Before this PR, any OP strategy that has a rule for more than one partial type and uses the expand util would have combined them inappropriately. After this PR, the expansion util filters out any such combinations. ghstack-source-id: 2483827 Pull Request resolved: #173614
…rtials" Per #172609, we do not allow mixed partial placements in one DTensor spec. It is therefore pointless to generate strategies that encourage mixed partial types. Before this PR, any OP strategy that has a rule for more than one partial type and uses the expand util would have combined them inappropriately. After this PR, the expansion util filters out any such combinations. [ghstack-poisoned]
Per #172609, we do not allow mixed partial placements in one DTensor spec. It is therefore pointless to generate strategies that encourage mixed partial types. Before this PR, any OP strategy that has a rule for more than one partial type and uses the expand util would have combined them inappropriately. After this PR, the expansion util filters out any such combinations. ghstack-source-id: e4407b4 Pull Request resolved: #173614
…rtials" Per #172609, we do not allow mixed partial placements in one DTensor spec. It is therefore pointless to generate strategies that encourage mixed partial types. Before this PR, any OP strategy that has a rule for more than one partial type and uses the expand util would have combined them inappropriately. After this PR, the expansion util filters out any such combinations. [ghstack-poisoned]
Per #172609, we do not allow mixed partial placements in one DTensor spec. It is therefore pointless to generate strategies that encourage mixed partial types. Before this PR, any OP strategy that has a rule for more than one partial type and uses the expand util would have combined them inappropriately. After this PR, the expansion util filters out any such combinations. ghstack-source-id: 38e8412 Pull Request resolved: #173614
@pytorchbot merge -i
Merge started. Your change will be merged while ignoring the following 4 checks: inductor / inductor-cpu-build / build, inductor / unit-test / inductor-halide-build / build, inductor / unit-test / inductor-cpu-core-build (3.12) / build, inductor / inductor-test-cuda13 / test (inductor_torchbench, 2, 2, linux.g5.4xlarge.nvidia.gpu). Learn more about merging in the wiki.
[DTensor] expand_to_full_mesh_op_strategy filters mixed partials (#173614)
Pull Request resolved: #173614
Approved by: https://github.com/pianpwk, https://github.com/zpcore