Replace rounding_mode="true" with rounding_mode=None #51988

peterbell10 wants to merge 13 commits into gh/peterbell10/50/base
Conversation
[ghstack-poisoned]
💊 CI failures summary and remediations

As of commit f8a5a4a (more details on the Dr. CI page):

🕵️ 1 new failure recognized by patterns. The following CI failures do not appear to be due to upstream breakages.
ghstack-source-id: b8c37fe Pull Request resolved: pytorch#51988
[ghstack-poisoned]
mruberry left a comment:

Awesome! Thanks for the quick fix, @peterbell10!
Differential Revision: [D26375070](https://our.internmc.facebook.com/intern/diff/D26375070) [ghstack-poisoned]
@mruberry PTAL. Rebased, tests are passing, and I sent a PR updating XLA: pytorch/xla#2859.
```python
a = torch.tensor([1.], requires_grad=True)
out = torch.div(a, 2., rounding_mode="trunc")
self.assertEqual(out.grad_fn._saved_rounding_mode, "trunc")  # std::string -> str
a = torch.ones(1, 1, requires_grad=True)
```
Why change this to be a linalg.qr test?

Because it doesn't have a str argument any more. See #55225 where I add it back.
mruberry left a comment:

I just have one small question. This is awesome, @peterbell10! Thank you for following through on these deprecations.
@JackCaoG let me know if/when you're ready to merge the XLA side of this, then we can coordinate a merge like we typically do.
Imported to hit internal tests. There are already some uses of rounding_mode within FB! But I don't think any were using the undocumented "true" mode.
@mruberry Peter submitted pytorch/xla#2859 and it is ready now. I will merge the XLA PR once this one is merged.
@JackCaoG Awesome! Beginning the land process now. I'll update here when it finishes.
Summary:
Pull Request resolved: pytorch#51988

* **pytorch#51988 Replace rounding_mode="true" with rounding_mode=None**

Test Plan: Imported from OSS

Reviewed By: ngimel

Differential Revision: D27561817

Pulled By: mruberry

fbshipit-source-id: 60d1d9c389570f60d599fc1876518717367fb368
Stack from ghstack:
Differential Revision: D27561817
BC-breaking note
`torch.divide` with `rounding_mode='true'` is replaced with `rounding_mode=None`

The undocumented `rounding_mode='true'` option of `torch.divide` has been removed. Instead, `rounding_mode=None` can be passed to indicate that no rounding should take place. This is equivalent to omitting the argument entirely.

Example:
Version 1.8.1:
Version 1.9.0:
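The 1.9.0 semantics described in the note can be sketched with a plain-Python stand-in for `torch.div` (an illustrative reimplementation, not PyTorch's actual code; the function body and error message are approximations):

```python
import math

def div(a, b, rounding_mode=None):
    # Illustrative stand-in for torch.div's rounding_mode semantics:
    #   None    -> true division (same as omitting the argument)
    #   "trunc" -> round the quotient toward zero
    #   "floor" -> round the quotient toward negative infinity
    # "true" is no longer accepted and raises an error.
    q = a / b
    if rounding_mode is None:
        return q
    if rounding_mode == "trunc":
        return math.trunc(q)
    if rounding_mode == "floor":
        return math.floor(q)
    raise ValueError(
        f"div expected rounding_mode to be one of None, 'trunc', "
        f"or 'floor' but found {rounding_mode!r}"
    )

print(div(7, 2))             # 3.5 -- no rounding, same as rounding_mode=None
print(div(7, 2, "trunc"))    # 3
print(div(-7, 2, "floor"))   # -4
```

With this change, code that previously passed `rounding_mode="true"` should either drop the argument or pass `rounding_mode=None` explicitly.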