Use torch.set_default_dtype in test_data_parallel and rename dtype2prec (#32962)
pritamdamania87 wants to merge 2 commits into gh/pritamdamania87/94/base
Conversation
As per @gchanan's comments on #30445, I've used `torch.set_default_dtype` in test_data_parallel instead of specifying `dtype=torch.double` everywhere. Also renamed `dtype2prec` to `dtype2prec_DONTUSE`.

Differential Revision: [D19714374](https://our.internmc.facebook.com/intern/diff/D19714374/)

[ghstack-poisoned]
ghstack-source-id: 97676959
Pull Request resolved: #32962
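The change swaps per-call `dtype=torch.double` arguments for a single default-dtype switch. A minimal sketch of that pattern, assuming a working `torch` install (the save/restore wrapper here is illustrative, not code from this PR):

```python
import torch

# Save the current default so the change doesn't leak into other tests.
prev = torch.get_default_dtype()
torch.set_default_dtype(torch.double)
try:
    # Factory functions now produce double tensors without any dtype= argument.
    t = torch.ones(3)
    assert t.dtype == torch.double
finally:
    torch.set_default_dtype(prev)  # always restore the previous default

assert torch.get_default_dtype() == prev
```

Restoring the default in a `finally` block matters in a test suite: a test that raises midway would otherwise silently change the dtype of every tensor created by later tests.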
```diff
-dtype2prec = {torch.float: 1e-5,
+# Using @precisionOverride is the recommended way instead of this.
+dtype2prec_DONTUSE = {torch.float: 1e-5,
```
Maybe add a bit more context: using a `precisionOverride` specific to your test is the recommended way of doing this. These are just some values that worked for test_nn.
gchanan left a comment:
Address the comments, then it looks good to go.
…me dtype2prec"

Pull Request resolved: #32962
ghstack-source-id: 98388429
Differential Revision: [D19714374](https://our.internmc.facebook.com/intern/diff/D19714374/)
💊 CircleCI build failures summary and remediations

As of commit e095808: none of the build failures appear to be your fault.

🚧 3 upstream failures recognized by patterns: these builds matched patterns, but were probably caused by upstream breakages.

This comment was automatically generated by Dr. CI.
This pull request has been merged in fd684cc.
…ec (pytorch#32962)

Summary: Pull Request resolved: pytorch#32962. As per gchanan's comments on pytorch#30445, I've used `torch.set_default_dtype` in test_data_parallel instead of specifying `dtype=torch.double` everywhere. Also renamed `dtype2prec` to `dtype2prec_DONTUSE`.

ghstack-source-id: 98388429
Test Plan: waitforbuildbot
Differential Revision: D19714374
fbshipit-source-id: eb55bbca33881625636ba9ea6dd4cb692f25668e
Stack from ghstack:
As per @gchanan's comments on #30445, I've used `torch.set_default_dtype` in test_data_parallel instead of specifying `dtype=torch.double` everywhere. Also renamed `dtype2prec` to `dtype2prec_DONTUSE`.
Differential Revision: D19714374