
Update foreach max/min #49714

Closed
izdeby wants to merge 32 commits into gh/izdeby/74/base from gh/izdeby/74/head

Conversation

@izdeby
Contributor

@izdeby izdeby commented Dec 21, 2020

Stack from ghstack:

Differential Revision: D25674139


  • Add support for torch.bool and torch.bfloat16 for foreach max/min
  • Updated the tests to use OpInfo

[ghstack-poisoned]
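For context on what the PR extends: the `_foreach` ops apply a binary op across two lists of tensors in a single call, so `torch._foreach_maximum(tensors1, tensors2)` returns a list where each element is the elementwise maximum of the corresponding pair. A minimal pure-Python sketch of that semantics, using plain lists in place of tensors (the function name here is illustrative, not the PyTorch API):

```python
# Sketch of _foreach_maximum semantics on plain Python lists.
# In PyTorch, each result[i] is the elementwise maximum of
# tensors1[i] and tensors2[i]; here "tensors" are just lists.

def foreach_maximum(lists1, lists2):
    """Elementwise maximum applied pairwise across two lists of lists."""
    if len(lists1) != len(lists2):
        raise ValueError("tensor lists must have the same length")
    return [
        [max(a, b) for a, b in zip(xs, ys)]
        for xs, ys in zip(lists1, lists2)
    ]

# bool inputs work too, since max() on bools follows False < True --
# the kind of behavior this PR adds for torch.bool.
print(foreach_maximum([[1, 5], [3, 2]], [[4, 0], [1, 6]]))  # [[4, 5], [3, 6]]
print(foreach_maximum([[True, False]], [[False, False]]))   # [[True, False]]
```

The point of the real fused op is to avoid per-tensor kernel-launch overhead when applying the same op to many tensors (e.g. all parameters of a model).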
@facebook-github-bot
Contributor

facebook-github-bot commented Dec 21, 2020

💊 CI failures summary and remediations

As of commit f32e2e0 (more details on the Dr. CI page):


  • 1/1 failures possibly* introduced in this PR
    • 1/1 non-scanned failure(s)

This comment was automatically generated by Dr. CI. Follow this link to opt out of these comments for your Pull Requests.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

self._test_pointwise_op(device, dtype, torch._foreach_addcdiv, torch._foreach_addcdiv_, torch.addcdiv)

-@dtypes(*torch.testing.get_all_dtypes(include_bfloat16=False, include_bool=False, include_complex=False))
+@dtypes(*torch.testing.get_all_dtypes())
Contributor


What added bfloat16 and bool support? Were those supported all along?

Contributor Author


I've added support for torch.bool and torch.bfloat16 in this PR.

self._test_pointwise_op(device, dtype, torch._foreach_addcdiv, torch._foreach_addcdiv_, torch.addcdiv)

-@dtypes(*torch.testing.get_all_dtypes(include_bfloat16=False, include_bool=False, include_complex=False))
+@dtypes(*torch.testing.get_all_dtypes())
Contributor


Is there a type promotion test for foreach_maximum / foreach_minimum, or is that out of the scope of this PR?

Contributor Author


Type promotion will be tested in upcoming PRs, once we move the mixed-dtype behavior from raising an error to taking the slow path.
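Type promotion here means that when the two tensor lists have different dtypes, the result dtype would follow `torch.result_type`-style rules instead of raising. A toy sketch of such a promotion rule over a simple ordered dtype lattice (PyTorch's real rules are richer, e.g. `bfloat16` with `float16` promotes to `float32`; this is only illustrative):

```python
# Toy sketch of binary type promotion: pick the "larger" of two dtype
# names from a simple linear ordering. PyTorch's real promotion table
# is more nuanced (scalar vs tensor operands, complex types, etc.).

_ORDER = ["bool", "uint8", "int8", "int16", "int32", "int64",
          "bfloat16", "float16", "float32", "float64"]

def promote(dtype_a, dtype_b):
    """Return the dtype both operands are promoted to under this toy rule."""
    return max(dtype_a, dtype_b, key=_ORDER.index)

print(promote("bool", "bfloat16"))   # bfloat16
print(promote("int32", "float32"))   # float32
```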

izdeby and others added 3 commits January 12, 2021 09:02
@izdeby izdeby mentioned this pull request Jan 26, 2021
@izdeby izdeby changed the title Update foreach max/min Update complex check for foreach max/min Jan 26, 2021
@izdeby izdeby changed the title Update complex check for foreach max/min Update foreach max/min Jan 26, 2021
Differential Revision: [D25674139](https://our.internmc.facebook.com/intern/diff/D25674139)

-------
- Add support for torch.bool and torch.bfloat16 for foreach max/min

[ghstack-poisoned]
@izdeby izdeby requested review from gchanan, ngimel and zou3519 January 27, 2021 15:27
Iurii Zdebskyi added 2 commits January 27, 2021 09:23
Differential Revision: [D25674139](https://our.internmc.facebook.com/intern/diff/D25674139)

-------
- Add support for torch.bool and torch.bfloat16 for foreach max/min
- Updated the tests to use OpInfo

[ghstack-poisoned]
Iurii Zdebskyi and others added 17 commits February 18, 2021 12:15
@facebook-github-bot
Contributor

Hi @izdeby!

Thank you for your pull request.

We require contributors to sign our Contributor License Agreement, and yours needs attention.

You currently have a record in our system, but the CLA is no longer valid, and will need to be resubmitted.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!

@ngimel ngimel removed their request for review May 30, 2021 23:44
@zou3519 zou3519 removed their request for review June 28, 2021 13:31
@github-actions github-actions bot closed this May 12, 2022
@facebook-github-bot facebook-github-bot deleted the gh/izdeby/74/head branch June 11, 2022 14:18


4 participants