adds quantized implementation of hard sigmoid #34607
vkuzo wants to merge 5 commits into gh/vkuzo/7/base from
Conversation
Summary: Adds quantized version of hardsigmoid activation. Note: not implementing the _ and .out versions is currently intended, because the implementation changes the scale and zp and it's nice to not allow the user to specify scale and zp. Lmk if we should handle this differently. Test Plan: tests benchmarks Reviewers: Subscribers: Tasks: Tags: [ghstack-poisoned]
💊 CircleCI build failures summary and remediations (as of commit 5fee38c): ✅ None of the build failures appear to be your fault 💚
This comment was automatically generated by Dr. CI.
LGTM, do we have some perf numbers as well for this?
Sure thing, here is the output of the current benchmark: https://our.internmc.facebook.com/intern/paste/P127415136/ I will follow up offline on the best way to benchmark all of these things on mobile!
supriyar left a comment:
Thanks for the benchmarks! I usually also compare against float to see the speedup. But looks good overall.
This pull request has been merged in 58c5b6d.
Looks like this PR breaks master, e.g.: https://app.circleci.com/pipelines/github/pytorch/pytorch/141312/workflows/9036f70f-a41d-4968-aedc-4cab6b4307f0/jobs/4849783
@vkuzo I am reverting this PR to fix test signals on master. Please resubmit with a fix. Thanks!
sorry about that, will do! |
Summary: Adds quantized implementation of hardsigmoid. Original PR was #34607 and had to be reverted for a test breakage, trying again. Test Plan: tests benchmarks Reviewers: Subscribers: Tasks: Tags: [ghstack-poisoned]
Summary: Pull Request resolved: #34959 Adds quantized implementation of hardsigmoid. Original PR was #34607 and had to be reverted for a test breakage, trying again. Test Plan: tests benchmarks Imported from OSS Differential Revision: D20514212 fbshipit-source-id: cc7ae3b67757e2dde5c313c05ce60a0f2625d961
Summary: Pull Request resolved: pytorch#34607 Adds quantized version of hardsigmoid activation. Note: not implementing the _ and .out versions is currently intended, because the implementation changes the scale and zp and it's nice to not allow the user to specify scale and zp. Lmk if we should handle this differently. Test Plan: tests benchmarks Imported from OSS Differential Revision: D20480546 fbshipit-source-id: 9febcb44afd920125ed2ca4900492f0b712078ea
Stack from ghstack:
Summary:
Adds quantized version of hardsigmoid activation.
Note: not implementing the `_` (in-place) and `.out` variants is
intentional, because the implementation changes the scale and
zero point (zp), and it is better not to let the user specify the scale
and zp themselves. Let me know if we should handle this differently.
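For context, a minimal pure-Python sketch of what a quantized hardsigmoid does and why it chooses its own output scale/zp rather than accepting user-supplied ones. The function names here are illustrative, not PyTorch's actual API (the real kernel is implemented in C++):

```python
def hardsigmoid(x: float) -> float:
    """Float reference: clamp(x / 6 + 0.5, 0, 1)."""
    return min(max(x / 6.0 + 0.5, 0.0), 1.0)

def qhardsigmoid_ref(q_vals, in_scale, in_zp):
    """Dequantize uint8 values, apply hardsigmoid, requantize.

    The output always lies in [0, 1], so the op can pick a fixed
    output quantization (here: scale = 1/256, zero point = 0)
    instead of reusing the input's scale/zp. That is why letting
    the user pass scale and zp to this op makes little sense.
    """
    out_scale, out_zp = 1.0 / 256.0, 0
    out = []
    for q in q_vals:
        x = (q - in_zp) * in_scale           # dequantize
        y = hardsigmoid(x)                   # apply the float op
        q_out = int(round(y / out_scale)) + out_zp
        out.append(min(max(q_out, 0), 255))  # clamp to uint8 range
    return out, out_scale, out_zp
```

For example, an input quantized value equal to its own zero point dequantizes to 0.0, maps to 0.5, and requantizes to 128 under the fixed output scale of 1/256.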
Test Plan:
tests
benchmarks
Reviewers:
Subscribers:
Tasks:
Tags:
Differential Revision: D20480546