[device_mesh] Implement _unflatten on top of CuTe layout bookkeeping #161224
Closed
fduwjj wants to merge 22 commits into gh/fduwjj/185/base
Conversation
[ghstack-poisoned]
This was referenced on Aug 21, 2025 (Closed).
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/161224
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 10d5870 with merge base ffc9559.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
wconstab reviewed on Aug 21, 2025
…ayout bookkeeping" cc H-Huang awgu wanchaol fegin wz337 wconstab d4l3k pragupta [ghstack-poisoned]
_unflatten on top of CuTe layout bookkeeping
fegin added a commit to pytorch/torchtitan that referenced this pull request on Aug 29, 2025:
This is a demonstration of how parallel_dims will be when using pytorch/pytorch#161224 stack.
fegin reviewed on Aug 29, 2025
ezyang reviewed on Sep 2, 2025
… bookkeeping" cc H-Huang awgu wanchaol fegin wz337 wconstab d4l3k pragupta [ghstack-poisoned]
This was referenced on Sep 18, 2025 (Closed).
fegin (Contributor) approved these changes on Oct 14, 2025:
LGTM, some test suggestions.
Comment on lines +1170 to +1175:

    backend_override: Optional[
        dict[
            Union[int, str],
            Union[str, C10dBackend.Options, tuple[str, C10dBackend.Options]],
        ]
    ] = None,
fegin (Contributor): Do we still want to support the key being an "int" option, given that this API already makes mesh_dim_names mandatory?

fduwjj (Contributor, Author): Good point, yeah, let's make it str only.
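To illustrate the agreed change, here is a minimal, self-contained sketch of the str-only annotation. `BackendOptions` is a hypothetical stand-in for `C10dBackend.Options` from the diff, used only so the sketch runs without torch; this is not the PR's actual code.

```python
from typing import Optional, Union


# Hypothetical stand-in for C10dBackend.Options, so this sketch is
# self-contained; the real type lives in torch.distributed.
class BackendOptions:
    pass


# After the review: keys are mesh dim names (str) only, since
# mesh_dim_names is mandatory in this API.
BackendOverride = Optional[
    dict[str, Union[str, BackendOptions, tuple[str, BackendOptions]]]
]

# Example value: one dim picks a backend by name, another also
# passes backend options.
override: BackendOverride = {
    "dp": "gloo",                      # backend name only
    "tp": ("nccl", BackendOptions()),  # backend name plus options
}
```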
… bookkeeping" cc H-Huang awgu wanchaol fegin wz337 wconstab d4l3k pragupta ezyang msaroufim dcci [ghstack-poisoned]
fduwjj (Contributor, Author):
@pytorchbot merge
Collaborator:
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
zhudada0120 pushed a commit to zhudada0120/pytorch that referenced this pull request on Oct 15, 2025:
pytorch#161224) Pull Request resolved: pytorch#161224 Approved by: https://github.com/lw, https://github.com/fegin ghstack dependencies: pytorch#164510
fegin added a commit to pytorch/torchtitan that referenced this pull request on Oct 15, 2025:
This is a demonstration of how parallel_dims will be when using pytorch/pytorch#161224 stack.
Chao1Han pushed a commit to Chao1Han/pytorch that referenced this pull request on Oct 21, 2025:
pytorch#161224) Pull Request resolved: pytorch#161224 Approved by: https://github.com/lw, https://github.com/fegin ghstack dependencies: pytorch#164510
fegin added a commit to pytorch/torchtitan that referenced this pull request on Oct 28, 2025:
This is a demonstration of how parallel_dims will be when using pytorch/pytorch#161224 stack. ghstack-source-id: d29d2e2 Pull-Request: #1885
fegin re-pushed the same torchtitan commit (ghstack-source-id: d29d2e2, Pull-Request: #1885), which referenced this pull request again on Nov 3, Nov 4, Nov 7, Nov 18, Dec 8, Dec 9 (three times), Dec 11, Dec 15, and Dec 16, 2025.
Stack from ghstack (oldest at bottom):

- _unflatten on top of CuTe layout bookkeeping #161224

cc @H-Huang @awgu @wanchaol @fegin @wz337 @wconstab @d4l3k @pragupta @ezyang @msaroufim @dcci
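As a rough illustration of the idea in the PR title (not the PR's actual implementation), CuTe-style bookkeeping tracks each mesh dimension as a (shape, stride) pair, and unflattening a dimension splits its size while deriving consistent sub-strides. The function and variable names below are hypothetical:

```python
# Illustrative sketch of (shape, stride) layout bookkeeping for
# unflattening a device-mesh dimension; names are hypothetical.
from math import prod


def coord_to_rank(coord, shape, stride):
    """Map a multi-dim mesh coordinate to a flat rank: sum(c * s)."""
    return sum(c * s for c, s in zip(coord, stride))


def unflatten_dim(shape, stride, dim, new_sizes):
    """Split shape[dim] into new_sizes, deriving row-major sub-strides."""
    assert prod(new_sizes) == shape[dim], "sizes must multiply to the old size"
    # Row-major: the last new dim keeps the original stride; each earlier
    # dim strides over the product of the sizes to its right.
    sub_strides = []
    s = stride[dim]
    for size in reversed(new_sizes):
        sub_strides.append(s)
        s *= size
    sub_strides.reverse()
    new_shape = shape[:dim] + tuple(new_sizes) + shape[dim + 1:]
    new_stride = stride[:dim] + tuple(sub_strides) + stride[dim + 1:]
    return new_shape, new_stride


# 8 ranks in a 1-D "dp" mesh, unflattened into ("dp_replicate", "dp_shard"):
shape, stride = unflatten_dim((8,), (1,), dim=0, new_sizes=(2, 4))
print(shape, stride)                          # (2, 4) (4, 1)
print(coord_to_rank((1, 2), shape, stride))   # 6
```

Under this bookkeeping, every coordinate of the unflattened layout still maps to the same set of flat ranks as the original dimension, which is what lets `_unflatten` reuse existing process groups rather than invent new rank assignments.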