
[BE] remove old pytorch version warning on strided sharding since 2.5 is officially released #665

Merged
XilunWu merged 2 commits into main from gh/XilunWu/8/head
Oct 30, 2024

Conversation

XilunWu (Contributor) commented Oct 30, 2024

Stack from ghstack (oldest at bottom):

#507 added a PyTorch version check when users try to use FSDP+TP, to make sure the installed PyTorch version includes DTensor strided sharding, which ensures correct DTensor checkpoints. Since PyTorch 2.5 is officially released and strided sharding is included in 2.5, we can safely remove this warning.
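For context, a guard of the kind #507 added typically parses the installed version string and warns when it predates the release that ships the needed feature. The sketch below is hypothetical (not the actual torchtitan code; the function and helper names are made up for illustration); it only shows the general pattern this PR removes.

```python
# Hypothetical sketch of a PyTorch-version guard like the one #507 added
# and this PR removes: warn if the installed PyTorch predates 2.5, the
# first release that ships DTensor strided sharding for FSDP+TP.
import warnings


def _version_tuple(version: str) -> tuple[int, int]:
    # "2.5.0+cu121" -> (2, 5); drop local/build suffixes like "+cu121".
    base = version.split("+")[0]
    major, minor = base.split(".")[:2]
    return int(major), int(minor)


def warn_if_missing_strided_sharding(torch_version: str) -> bool:
    """Warn and return True if torch_version lacks DTensor strided sharding."""
    if _version_tuple(torch_version) < (2, 5):
        warnings.warn(
            "FSDP+TP requires PyTorch >= 2.5 for DTensor strided sharding; "
            "checkpoints saved with older versions may be incorrect."
        )
        return True
    return False
```

In the real code one would pass `torch.__version__`; once the minimum supported PyTorch is 2.5, the check is dead code and can simply be deleted, which is what this PR does.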

XilunWu added a commit that referenced this pull request Oct 30, 2024
… is official released

ghstack-source-id: 36338d9
Pull Request resolved: #665
@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Meta Open Source bot. label Oct 30, 2024
XilunWu added a commit that referenced this pull request Oct 30, 2024
… is official released

ghstack-source-id: 748ae8e
Pull Request resolved: #665
@XilunWu XilunWu requested review from fegin, tianyu-l and wz337 October 30, 2024 21:48
@XilunWu XilunWu added the better engineering Repo code quality improvements label Oct 30, 2024
@XilunWu XilunWu changed the base branch from gh/XilunWu/8/base to main October 30, 2024 22:06
@XilunWu XilunWu merged commit 2a785e9 into main Oct 30, 2024
mori360 pushed a commit to mori360/torchtitan that referenced this pull request Nov 26, 2024
… is official released (pytorch#665)
xrsrke pushed a commit to NousResearch/torchtitan that referenced this pull request Feb 13, 2026
… is official released (pytorch#665)
xrsrke pushed a commit to NousResearch/torchtitan that referenced this pull request Feb 25, 2026
… is official released (pytorch#665)

Labels

better engineering: Repo code quality improvements
CLA Signed: This label is managed by the Meta Open Source bot.
