💊 CI failures summary and remediations: as of commit f32e2e0, see the Dr. CI page for details. (This comment was automatically generated by Dr. CI.)
test/test_foreach.py (outdated)

      self._test_pointwise_op(device, dtype, torch._foreach_addcdiv, torch._foreach_addcdiv_, torch.addcdiv)

  -   @dtypes(*torch.testing.get_all_dtypes(include_bfloat16=False, include_bool=False, include_complex=False))
  +   @dtypes(*torch.testing.get_all_dtypes())
What added bfloat16 and bool support? Were those supported all along?
I've added support for torch.bool and torch.bfloat16 in this PR.
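For context, the foreach max/min ops discussed here compute an elementwise maximum (or minimum) across two lists of tensors. A minimal pure-Python sketch of those semantics (a hypothetical reference, using nested lists of scalars rather than the actual torch tensors and kernels; `foreach_maximum_ref` is an illustrative name, not a real API):

```python
# Hypothetical pure-Python reference for torch._foreach_maximum semantics:
# given two equal-length lists of equal-shape "tensors" (here, lists of
# scalars), return a new list holding the elementwise maximum of each pair.
# bool values work because Python's max() treats True > False, mirroring
# the torch.bool support this PR adds.

def foreach_maximum_ref(tensors1, tensors2):
    if len(tensors1) != len(tensors2):
        raise ValueError("tensor lists must have the same length")
    return [
        [max(a, b) for a, b in zip(t1, t2)]
        for t1, t2 in zip(tensors1, tensors2)
    ]

out = foreach_maximum_ref([[1, 5], [True, False]], [[3, 2], [False, False]])
```

The real ops process the whole tensor list in one call (on CUDA, with fused multi-tensor kernels), which is what makes them faster than a Python loop over `torch.maximum`.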
Is there a type promotion test for foreach_maximum / foreach_minimum, or is that out of the scope of this PR?
Type promotion will be tested in upcoming PRs, once we move mismatched-dtype handling from raising an error to the slow path.
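For reference, the promotion behavior being deferred would follow PyTorch's standard rules, where dtype categories promote upward (bool < integer < floating point). A simplified, hypothetical sketch of such a lookup (the names `promote`, `_CATEGORY`, and `_DEFAULT` are illustrative; the real rules live in `torch.promote_types`):

```python
# Simplified sketch of binary-op dtype promotion (hypothetical; not the
# actual PyTorch implementation). Categories promote upward:
# bool (0) < integer (1) < floating point (2).

_CATEGORY = {"bool": 0, "int32": 1, "int64": 1, "bfloat16": 2, "float32": 2}
_DEFAULT = {0: "bool", 1: "int64", 2: "float32"}

def promote(dtype_a, dtype_b):
    # Same dtype: nothing to promote.
    if dtype_a == dtype_b:
        return dtype_a
    ca, cb = _CATEGORY[dtype_a], _CATEGORY[dtype_b]
    if ca == cb:
        # Same category, different widths: fall back to the category default.
        return _DEFAULT[ca]
    # Different categories: the higher category wins.
    return dtype_a if ca > cb else dtype_b
```

A type-promotion test for foreach_maximum/foreach_minimum would then assert that the output dtype of each result tensor matches this promoted dtype for every input pairing.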
Differential Revision: [D25674139](https://our.internmc.facebook.com/intern/diff/D25674139)

- Add support for torch.bool and torch.bfloat16 for foreach max/min
- Updated the tests to use OpInfo
Hi @izdeby! Thank you for your pull request. We require contributors to sign our Contributor License Agreement, and yours needs attention. You currently have a record in our system, but the CLA is no longer valid and will need to be resubmitted.

Process: in order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient, and your employer may need to sign the corporate CLA. Once the CLA is signed, our tooling will perform checks and validations, and the pull request will be tagged accordingly.

If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!
Stack from ghstack:
Differential Revision: D25674139