
[Caffe2] Refactoring MIOpen activation ops#13187

Closed
ashishfarmer wants to merge 10 commits into pytorch:master from ROCm:af/mio_act_refactor

Conversation

@ashishfarmer

This pull request contains changes for:

  1. Adding a generalized MIOpen activation class to be used by activation operators
  2. Refactoring MIOpen ReLU op to use the new class
  3. Adding ELU, Tanh and Sigmoid MIOpen ops
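To illustrate the idea of a single generalized activation class reused by several operators, here is a minimal plain-C++ sketch. This is my own illustration, not the actual Caffe2/MIOpen code: the real op would hold a `miopenActivationDescriptor_t` and dispatch to `miopenActivationForward`, whereas this sketch parameterizes one op class over a host-side functor just to show the structure.

```cpp
#include <cmath>
#include <vector>

// Hypothetical activation functors (names are illustrative, not from the PR).
struct Relu    { float operator()(float x) const { return x > 0.f ? x : 0.f; } };
struct Elu     { float alpha = 1.f;
                 float operator()(float x) const {
                   return x > 0.f ? x : alpha * (std::exp(x) - 1.f); } };
struct Tanh    { float operator()(float x) const { return std::tanh(x); } };
struct Sigmoid { float operator()(float x) const {
                   return 1.f / (1.f + std::exp(-x)); } };

// One generalized op class; ReLU/ELU/Tanh/Sigmoid become instantiations
// instead of four separate operator implementations.
template <typename Activation>
class ActivationOp {
 public:
  explicit ActivationOp(Activation act = Activation{}) : act_(act) {}

  // Forward pass: apply the activation element-wise.
  std::vector<float> Run(const std::vector<float>& in) const {
    std::vector<float> out(in.size());
    for (size_t i = 0; i < in.size(); ++i) {
      out[i] = act_(in[i]);
    }
    return out;
  }

 private:
  Activation act_;
};
```

With this shape, adding a new activation (as items 3 above does for ELU, Tanh, and Sigmoid) means adding a functor and an instantiation rather than a new op class.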

cc: @bddppq

@ashishfarmer
Author

ashishfarmer commented Oct 26, 2018

@bddppq bddppq added the module: rocm AMD GPU support for Pytorch label Oct 26, 2018
@bddppq bddppq self-requested a review October 26, 2018 21:42
Contributor

@bddppq bddppq left a comment


Nice work. Could you add some tests in caffe2/python/operator_tests?

@ashishfarmer
Author

activation_ops_test.py and hyperbolic_ops_test.py already have tests covering the changes in this PR. Because the MIOpen ops alias the cuDNN engine, nothing in the Python tests needed to change.
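For context, tests of this style typically compare an operator's output against a plain reference implementation. The following is an illustrative sketch (function names are mine, not from activation_ops_test.py) of the reference math those tests check for the ops this PR adds:

```python
import math

# Reference activations, as a plain-Python sketch of what an
# operator test would compare MIOpen/cuDNN output against.

def relu(x):
    return max(x, 0.0)

def elu(x, alpha=1.0):
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    return math.tanh(x)
```

An actual operator test would run the op on both the reference device and the GPU and assert the outputs agree within tolerance.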

Contributor

@facebook-github-bot facebook-github-bot left a comment


bddppq has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@bddppq bddppq self-requested a review October 27, 2018 06:37
facebook-github-bot pushed a commit that referenced this pull request Oct 27, 2018
Summary:
This pull request contains changes for:
1. Adding a generalized MIOpen activation class to be used by activation operators
2. Refactoring MIOpen ReLU op to use the new class
3. Adding ELU, Tanh and Sigmoid MIOpen ops

Differential Revision: D12810112

Pulled By: bddppq

fbshipit-source-id: 9519b3a0cd733b906bcba5d8948be089029c43ac
@bddppq
Contributor

bddppq commented Oct 27, 2018

Landed

@bddppq bddppq closed this Oct 27, 2018
@ashishfarmer ashishfarmer deleted the af/mio_act_refactor branch October 29, 2018 19:21
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026