[quant][graphmode][fx] Add support for quantizing functional linear + {functional relu/module relu} #50975
jerryzh168 wants to merge 2 commits into gh/jerryzh168/531/base
Conversation
… {functional relu/module relu}
Summary:
Test Plan:
Reviewers:
Subscribers:
Tasks:
Tags:
[ghstack-poisoned]
💊 CI failures summary and remediations — as of commit bb2f04a (more details on the Dr. CI page): 💚 Looks good so far! There are no failures yet. 💚 (This comment was automatically generated by Dr. CI.)
…al linear + {functional relu/module relu}"
Differential Revision: [D26032532](https://our.internmc.facebook.com/intern/diff/D26032532)
[ghstack-poisoned]
Codecov Report

@@                Coverage Diff                 @@
##   gh/jerryzh168/531/base    #50975     +/-  ##
=================================================
- Coverage               80.97%    80.66%   -0.32%
=================================================
  Files                    1919      1919
  Lines                  209785    209796      +11
=================================================
- Hits                   169875    169225     -650
- Misses                  39910     40571     +661
if activation_statically_quantized:
    # quantize output for statically quantized linear op
    root_module = quantizer.modules['']
    act_post_process_name = self.relu_node.name if self.relu_node else self.linear_node.name
If debug is true, do we still insert an observer at the output of the relu node? relu doesn't need output_scale and zero_point, right?
Yes, we still insert the observer at the output of relu. debug does not affect where observers are inserted; it only affects the quantized model we produce.
relu itself does not need output_scale/output_zero_point, but linear_relu and conv_relu do need them.
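For context, here is a minimal sketch of the pattern this PR handles, assuming the FX graph mode quantization API from around this release (prepare_fx/convert_fx with a qconfig_dict); the module and its names are illustrative only, not code from this PR:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.quantization import get_default_qconfig
from torch.quantization.quantize_fx import prepare_fx, convert_fx

class FunctionalLinearRelu(nn.Module):
    """Toy module using functional linear + functional relu (the pattern this PR quantizes)."""
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(4, 4))
        self.bias = nn.Parameter(torch.zeros(4))

    def forward(self, x):
        return F.relu(F.linear(x, self.weight, self.bias))

model = FunctionalLinearRelu().eval()
qconfig_dict = {"": get_default_qconfig("fbgemm")}

# prepare_fx inserts the activation post-process (observer) after the relu node,
# since linear + relu is matched as a single pattern
prepared = prepare_fx(model, qconfig_dict)
prepared(torch.randn(2, 4))  # calibration pass

# convert_fx replaces the pattern with a fused quantized linear_relu op,
# using the scale/zero_point recorded by the observer at the relu output
quantized = convert_fx(prepared)
print(quantized.graph)
```

Note this is only a rough sketch for the API at the time of this PR; in newer PyTorch releases the entry points moved to torch.ao.quantization and prepare_fx also takes example_inputs.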
supriyar left a comment: Thanks for the quick fix! Left a small comment.
@jerryzh168, do we need a similar bugfix for conv2d + relu fusion as well? It seems like we don't do that fusion in convert either.
Yes we do, will add that in a later PR.
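For reference, the analogous functional conv2d + relu pattern that the follow-up fix would need to fuse in convert looks roughly like this (an illustrative sketch only, not code from this PR):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FunctionalConvRelu(nn.Module):
    """Toy module with the functional conv2d + relu pattern mentioned above."""
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(8, 3, 3, 3))
        self.bias = nn.Parameter(torch.zeros(8))

    def forward(self, x):
        # without the follow-up fix, convert would quantize conv2d and relu
        # separately instead of producing a fused quantized conv-relu op
        return F.relu(F.conv2d(x, self.weight, self.bias))
```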
This pull request has been merged in 28869d5.
… {functional relu/module relu} (pytorch#50975)
Summary: Pull Request resolved: pytorch#50975
Test Plan: Imported from OSS
Reviewed By: supriyar
Differential Revision: D26032532
fbshipit-source-id: a084fb4fd711ad52b2da1c6378cbcc2b352976c6