quantized layer norm: add to static quant #36690

vkuzo wants to merge 3 commits into gh/vkuzo/35/base
Conversation
Summary: Adds the static quantization hook for LayerNorm

Test Plan:

```
python test/quantization/test_quantized_nn_mods.py ModuleAPITest.test_layer_norm
python test/quantization/test_quantization.py EagerModePostTrainingQuantTest.test_normalization
```

Reviewers:

Subscribers:

Tasks:

Tags:

[ghstack-poisoned]
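For context, the hook being added lets `nn.LayerNorm` participate in the usual eager-mode static quantization flow (prepare, calibrate, convert). A minimal sketch of that flow; the model definition and calibration data here are illustrative, not taken from this PR:

```python
import torch
import torch.nn as nn

class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = torch.quantization.QuantStub()
        self.ln = nn.LayerNorm(4)
        self.dequant = torch.quantization.DeQuantStub()

    def forward(self, x):
        # quantize input -> LayerNorm (quantized after convert) -> dequantize
        return self.dequant(self.ln(self.quant(x)))

m = M().eval()
m.qconfig = torch.quantization.default_qconfig
torch.quantization.prepare(m, inplace=True)   # insert observers
m(torch.randn(8, 4))                          # calibration pass
torch.quantization.convert(m, inplace=True)   # swap observed modules for quantized ones
out = m(torch.randn(8, 4))
```

After `convert`, the float `LayerNorm` is swapped for its quantized counterpart via the static-quant module mapping, which is what this PR enables.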
💊 Build failures summary and remediations

As of commit f3af14b:

XLA failure: job pytorch_xla_linux_bionic_py3_6_clang9_build is failing.

(This comment was automatically generated by Dr. CI and has been revised 7 times.)
```python
        checkQuantized(model)

    def test_normalization(self):
```
This test (and the other tests here) only covers FBGEMM; this needs to be fixed in a later PR.
```python
self.assertEqual(quant_ref.int_repr().numpy(), qy.int_repr().numpy(),
                 message="BatchNorm3d module API failed")
```
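For reference, `int_repr()` exposes the underlying integer storage of a quantized tensor, which is why the assertion above compares integer representations rather than dequantized floats. A small illustrative example (values chosen here for demonstration, not from the test):

```python
import torch

x = torch.tensor([0.5, 1.0, 1.5])
# quantize with scale 0.1, zero point 0: stored int = round(value / scale)
qx = torch.quantize_per_tensor(x, scale=0.1, zero_point=0, dtype=torch.quint8)
print(qx.int_repr())  # tensor([ 5, 10, 15], dtype=torch.uint8)
```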
```python
    def test_layer_norm(self):
```
Can we test this over both QNNPACK and FBGEMM? Currently only the x86 (FBGEMM) path is tested.

`@given(qengine=st.sampled_from(("qnnpack", "fbgemm")))` and using `qengine` as an argument to the test should do it.
We don't have a mobile implementation for LayerNorm. It might run correctly on mobile, but it would be too slow to be useful. Should we still test for it? I'm flexible, just curious how we've done it in the past.
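The reviewer's `@given` suggestion can be sketched as follows. This is a hedged sketch, not the PR's actual test: the skip logic and the body (a simple quantize/dequantize round-trip standing in for the real LayerNorm checks) are illustrative.

```python
import torch
from hypothesis import given, strategies as st

@given(qengine=st.sampled_from(("qnnpack", "fbgemm")))
def test_layer_norm(qengine):
    if qengine not in torch.backends.quantized.supported_engines:
        return  # engine not compiled into this build; nothing to test
    prev = torch.backends.quantized.engine
    torch.backends.quantized.engine = qengine
    try:
        # placeholder body: exercise a quantized kernel under the chosen engine
        x = torch.rand(2, 4)
        qx = torch.quantize_per_tensor(x, scale=0.01, zero_point=0,
                                       dtype=torch.quint8)
        # round-trip error is bounded by half the scale
        assert torch.allclose(qx.dequantize(), x, atol=0.01)
    finally:
        torch.backends.quantized.engine = prev  # always restore the engine

test_layer_norm()  # hypothesis draws both engines when called directly
```

Setting `torch.backends.quantized.engine` inside the test (and restoring it in `finally`) is the part that switches the backend per drawn `qengine`.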
raghuramank100 left a comment:

Requesting that the test be extended to qnnpack; looks good otherwise!
This pull request has been merged in 2c558db.
Summary: Pull Request resolved: pytorch#36690

Adds the static quantization hook for LayerNorm

Test Plan:

```
python test/quantization/test_quantized_nn_mods.py ModuleAPITest.test_layer_norm
python test/quantization/test_quantization.py EagerModePostTrainingQuantTest.test_normalization
```

Imported from OSS

Differential Revision: D21055401

fbshipit-source-id: 188329f35359576d50ed0db5fb675ce68c28bf7d
Stack from ghstack:
Summary:
Adds the static quantization hook for LayerNorm
Test Plan:
Reviewers:
Subscribers:
Tasks:
Tags:
Differential Revision: D21055401