Allow GradScaler to be pickled #38296
Conversation
💊 CI failures summary: As of commit c870c83 (more details on the Dr. CI page): 💚 Looks good so far! There are no failures yet. 💚
Thank you! Generally, anything that can't be pickled is a problem for us, as we automate the DDP init, which uses spawn.
@albanD -- would you be willing to review this PR?
Sure I'll take a look tomorrow morning |
ngimel
left a comment
Move the test please, otherwise lgtm.
    b = pickle.loads(serialized)
    self.assertEqual(a, b)

    def test_pickle_gradscaler(self):
can you add it to some of the device-generic tests rather than _TorchMixin? We are trying to get rid of the latter.
Moved to TestTorchDeviceType and made it more devoutly device agnostic (see comments in code)
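For context, the roundtrip the hunk above checks can be sketched as a standalone unittest. `FakeScaler` is a hypothetical stand-in for `torch.cuda.amp.GradScaler` (its attributes here are illustrative), so the sketch runs without a CUDA build:

```python
import pickle
import unittest

class FakeScaler:
    # Hypothetical stand-in for torch.cuda.amp.GradScaler: it holds only
    # plain attributes, so the default pickle machinery handles it.
    def __init__(self, init_scale=65536.0, growth_factor=2.0):
        self._scale = init_scale
        self._growth_factor = growth_factor

class TestPickleScaler(unittest.TestCase):
    def test_pickle_scaler(self):
        a = FakeScaler(init_scale=1024.0)
        serialized = pickle.dumps(a)
        b = pickle.loads(serialized)
        # Compare attribute by attribute, mirroring assertEqual(a, b).
        self.assertEqual(a._scale, b._scale)
        self.assertEqual(a._growth_factor, b._growth_factor)

if __name__ == "__main__":
    unittest.main()
```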
albanD
left a comment
Looks good, same comment on tests as natalia.
    # - Lambdas can't be pickled, so we don't want to supply a lambda as the factory.
    # - Defining READY, UNSCALED, STEPPED and _refresh_per_optimizer_state within GradScaler
    #   causes a circular reference, which we'd rather avoid.
    READY = 0
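The constraint the comment describes is easy to reproduce with a plain `defaultdict`; this is a minimal sketch rather than the PR's code, and the contents of the dict the factory returns are illustrative:

```python
import pickle
from collections import defaultdict

# A lambda factory makes the whole defaultdict unpicklable: pickle
# serializes functions by qualified name, and a lambda's qualname
# ("<lambda>") can't be looked up again at load time.
bad = defaultdict(lambda: {"stage": 0})
try:
    pickle.dumps(bad)
except (pickle.PicklingError, AttributeError) as exc:
    print("lambda factory fails:", type(exc).__name__)

# The fix: a factory defined at module level, which pickle can
# reference by name when the dict is deserialized.
def _refresh_per_optimizer_state():
    return {"stage": 0, "found_inf_per_device": {}}

good = defaultdict(_refresh_per_optimizer_state)
good["opt0"]["stage"] = 1
restored = pickle.loads(pickle.dumps(good))
```

After the roundtrip, `restored` still creates fresh default entries on missing keys, because the factory travels with the pickle by name.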
Do we want to make this an enum now that we don't support Python 2 anymore? You removed the comment mentioning this.
facebook-github-bot
left a comment
@albanD is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary: Should unblock Lightning-AI/pytorch-lightning#1782. Pull Request resolved: pytorch#38296 Differential Revision: D21553296 Pulled By: albanD fbshipit-source-id: 9041a72d7cf8833e4b01bc767fd2321f17c7c5f2