Add mish activation function #58648
Conversation
Hi @Adnios! Thank you for your pull request and welcome to our community.

Action Required: In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process: In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged accordingly. If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!
💊 CI failures summary and remediations

As of commit a5635db (more details on the Dr. CI page):

🕵️ 2 new failures recognized by patterns. The following CI failures do not appear to be due to upstream breakages:
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Facebook open source project. Thanks!
i simple man, i see structured, i likey (letting @jbschlosser do the actual review)
jbschlosser
left a comment
Wow, thank you for the beautiful and thorough work! Couple small things and this will be good to go :)
- Please add tests for the C++ module / functional forms to test/cpp/api/modules.cpp and test/cpp/api/functional.cpp. Once built, they can be run with build/bin/test_api.
- Optionally, adding an OpInfo entry for mish() to the op_db list in torch/testing/_internal/common_methods_invocations.py will add a bunch more generic testing.
- I don't think it's worth doing for this PR, but our softplus implementation accepts a beta parameter; in the future, it might be worth exposing this as configurable for mish as well.
jbschlosser
left a comment
Looks great! One tiny lint-related issue and we're good to go :)
@jbschlosser has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@Adnios Could you mark the test with
Hello @JackCaoG, thanks for your help! I just checked the log message of
Codecov Report

@@            Coverage Diff            @@
##           master   #58648   +/-   ##
=======================================
  Coverage   76.44%   76.45%
=======================================
  Files        1999     1999
  Lines      200570   200625      +55
=======================================
+ Hits       153335   153385      +50
- Misses      47235    47240       +5
@jbschlosser has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Signed-off-by: Adnios <2780199647@qq.com>
@jbschlosser has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@jbschlosser merged this pull request in 09a8f22.
Summary: See issue: pytorch#58375 Pull Request resolved: pytorch#58648 Reviewed By: gchanan Differential Revision: D28625390 Pulled By: jbschlosser fbshipit-source-id: 23ea2eb7d5b3dc89c6809ff6581b90ee742149f4
    const Tensor& input) {
  auto input_tanh_softplus = at::tanh(at::softplus(input));
  auto input_sigmoid = at::sigmoid(input);
  return grad_output * (input_tanh_softplus + (input * input_sigmoid * (1 - input_tanh_softplus * input_tanh_softplus)));
Will this create many intermediate copies? Could some of these be done in-place?
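The backward above applies the chain rule: since softplus'(x) = sigmoid(x) and d/du tanh(u) = 1 − tanh²(u), we get d/dx [x · tanh(softplus(x))] = tanh(softplus(x)) + x · sigmoid(x) · (1 − tanh²(softplus(x))), which is exactly the expression returned. A quick scalar sanity check of that formula against a central finite difference, sketched in plain Python rather than ATen (all names here are illustrative):

```python
import math

def softplus(x):
    # numerically stable log(1 + exp(x))
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mish(x):
    return x * math.tanh(softplus(x))

def mish_grad(x):
    # mirrors the C++ backward above, with grad_output = 1
    t = math.tanh(softplus(x))
    return t + x * sigmoid(x) * (1.0 - t * t)

# central finite-difference check at a few points
eps = 1e-6
for x in (-2.0, -0.5, 0.0, 1.3, 3.0):
    fd = (mish(x + eps) - mish(x - eps)) / (2.0 * eps)
    assert abs(fd - mish_grad(x)) < 1e-5
```

On the in-place question: each at:: call in the snippet does allocate an intermediate in eager mode, but input_tanh_softplus is reused three times, so the temporaries are hard to eliminate without in-place variants on freshly allocated results.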
See issue: #58375