
Only make a shallow copy when loading optimizer state_dict #106082

Closed
janeyx99 wants to merge 6 commits into gh/janeyx99/78/base from gh/janeyx99/78/head

Conversation

Contributor

@janeyx99 janeyx99 commented Jul 26, 2023

The one thing we still deep-copy is the param_groups, which is much lighter weight. This should also save memory when loading from a checkpoint.

The deepcopy was introduced in ecfcf39, but module.py had only a shallow copy at that point, so it did not actually bring parity.

This also incorporates an XLA fix, which is why I'm updating the pin to pytorch/xla@ca5eab8.

Stack from ghstack (oldest at bottom):
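As a rough illustration of the memory win described above — a minimal sketch using plain dicts and lists in place of real tensors, not the actual torch.optim implementation:

```python
import copy

# Stand-in for a loaded optimizer state_dict: the list plays the role of a
# large tensor buffer, which is exactly what we want to avoid duplicating.
state_dict = {
    "state": {0: {"momentum_buffer": [0.0] * 4}},
    "param_groups": [{"lr": 0.01, "params": [0]}],
}

# Old behavior: deepcopy duplicates everything, including the large buffers.
old = copy.deepcopy(state_dict)
assert old["state"][0]["momentum_buffer"] is not state_dict["state"][0]["momentum_buffer"]

# New behavior: only the lightweight param_groups are deep-copied; the
# per-parameter state is shallow-copied, so the big buffers are shared.
new = {
    "state": dict(state_dict["state"]),
    "param_groups": copy.deepcopy(state_dict["param_groups"]),
}
assert new["state"][0]["momentum_buffer"] is state_dict["state"][0]["momentum_buffer"]
assert new["param_groups"][0] is not state_dict["param_groups"][0]
```

The same tensor objects backing the state are now referenced rather than cloned, which is where the checkpoint-loading memory savings come from.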

@janeyx99 janeyx99 requested a review from albanD as a code owner July 26, 2023 22:38
@pytorch-bot pytorch-bot Bot added the release notes: optimizer Relating to optimizers, torch.optim label Jul 26, 2023

pytorch-bot Bot commented Jul 26, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/106082

Note: Links to docs will display an error until the docs builds have been completed.

✅ 2 Unrelated Failures

As of commit 8aea828:

UNSTABLE - The following jobs failed, likely due to flakiness present on trunk, and have been marked as unstable:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@janeyx99 janeyx99 added the topic: performance topic category label Jul 26, 2023
janeyx99 added 2 commits July 26, 2023 18:25
This should also save memory when loading from a checkpoint.

[ghstack-poisoned]

This should also save memory when loading from a checkpoint.

[ghstack-poisoned]
Comment thread .git-blame-ignore-revs
Collaborator

@albanD albanD left a comment


SGTM! Very good catch!

Comment thread .git-blame-ignore-revs
The one thing we still deep-copy is the param_groups, which is much lighter weight. This should also save memory when loading from a checkpoint.

The deepcopy was introduced in ecfcf39, but module.py had only a shallow copy at that point, so it did not actually bring parity.

[ghstack-poisoned]
janeyx99 added a commit that referenced this pull request Jul 28, 2023
Collaborator

@albanD albanD left a comment


Ok!

The one thing we still deep-copy is the param_groups, which is much lighter weight. This should also save memory when loading from a checkpoint.

The deepcopy was introduced in ecfcf39, but module.py had only a shallow copy at that point, so it did not actually bring parity.

[ghstack-poisoned]
janeyx99 added a commit that referenced this pull request Jul 31, 2023
@janeyx99
Contributor Author

@pytorchbot merge

@pytorch-bot pytorch-bot Bot added the ciflow/trunk Trigger trunk jobs on your pull request label Jul 31, 2023
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@pytorchmergebot
Collaborator

Merge failed

Reason: Command git -C /home/runner/work/pytorch/pytorch cherry-pick -x 46dd0f0a2d97b43b6e730a525d4b178b82f36a6f returned non-zero exit code 1

Auto-merging .git-blame-ignore-revs
CONFLICT (content): Merge conflict in .git-blame-ignore-revs
Auto-merging test/optim/test_optim.py
error: could not apply 46dd0f0a2d9... Only make a shallow copy when loading optimizer state_dict
hint: After resolving the conflicts, mark them with
hint: "git add/rm <pathspec>", then run
hint: "git cherry-pick --continue".
hint: You can instead skip this commit with "git cherry-pick --skip".
hint: To abort and get back to the state before "git cherry-pick",
hint: run "git cherry-pick --abort".
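The conflict above can be resolved locally by following the bot's hints. A sketch of that flow in a throwaway repository — branch names, file contents, and commit messages here are purely illustrative, not the actual pytorch/pytorch history:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q repo && cd repo
git config user.email ci@example.com && git config user.name ci

# Base commit, then a feature branch that edits .git-blame-ignore-revs.
echo base > .git-blame-ignore-revs
git add . && git commit -qm base
git checkout -qb feature
echo feature-rev > .git-blame-ignore-revs
git add . && git commit -qm feature
pick=$(git rev-parse HEAD)

# Trunk moves on and edits the same file, setting up the conflict.
git checkout -q master 2>/dev/null || git checkout -q main
echo trunk-rev > .git-blame-ignore-revs
git add . && git commit -qm trunk

# The cherry-pick fails with a content conflict, as in the merge bot's log.
git cherry-pick -x "$pick" || true

# Resolve (here: keep both revisions), stage, and continue per the hints.
printf 'trunk-rev\nfeature-rev\n' > .git-blame-ignore-revs
git add .git-blame-ignore-revs
git -c core.editor=true cherry-pick --continue
```

`git cherry-pick --abort` would instead return the branch to its pre-pick state, and `--skip` would drop the commit entirely.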
Details for Dev Infra team: raised by workflow job.

The one thing we still deep-copy is the param_groups, which is much lighter weight. This should also save memory when loading from a checkpoint.

The deepcopy was introduced in ecfcf39, but module.py had only a shallow copy at that point, so it did not actually bring parity.

This also incorporates an XLA fix, which is why I'm updating the pin to pytorch/xla@ca5eab8.

[ghstack-poisoned]
janeyx99 added a commit that referenced this pull request Aug 1, 2023
@janeyx99
Contributor Author

janeyx99 commented Aug 1, 2023

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.


Labels

ciflow/inductor, ciflow/trunk (Trigger trunk jobs on your pull request), Merged, release notes: optimizer (Relating to optimizers, torch.optim), topic: performance (topic category)

4 participants