
[quant][graphmode][fx] Add support for quantizing functional linear + {functional relu/module relu}#50975

Closed
jerryzh168 wants to merge 2 commits into gh/jerryzh168/531/base from gh/jerryzh168/531/head

Conversation


@jerryzh168 jerryzh168 commented Jan 23, 2021

Stack from ghstack:

Summary:

Test Plan:

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: D26032532
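
For context, a minimal sketch of the pattern this PR adds support for: a functional linear followed by either a functional relu or an nn.ReLU module, quantized with FX graph mode quantization. The module and values are illustrative, not taken from the PR, and the API shown is the one from around the time of this PR (early 2021):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.quantization import get_default_qconfig
from torch.quantization.quantize_fx import prepare_fx, convert_fx

class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.w = nn.Parameter(torch.randn(4, 4))
        self.relu = nn.ReLU()

    def forward(self, x):
        x = F.relu(F.linear(x, self.w))        # functional linear + functional relu
        return self.relu(F.linear(x, self.w))  # functional linear + module relu

m = M().eval()
prepared = prepare_fx(m, {"": get_default_qconfig("fbgemm")})
prepared(torch.randn(2, 4))       # calibration pass
quantized = convert_fx(prepared)  # lowers both patterns to the fused quantized linear + relu op
```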

… {functional relu/module relu}


[ghstack-poisoned]

facebook-github-bot commented Jan 23, 2021

💊 CI failures summary and remediations

As of commit bb2f04a (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

…al linear + {functional relu/module relu}"


Differential Revision: [D26032532](https://our.internmc.facebook.com/intern/diff/D26032532)

[ghstack-poisoned]

codecov Bot commented Jan 23, 2021

Codecov Report

Merging #50975 (bb2f04a) into gh/jerryzh168/531/base (789f6f1) will decrease coverage by 0.31%.
The diff coverage is 100.00%.

@@                    Coverage Diff                     @@
##           gh/jerryzh168/531/base   #50975      +/-   ##
==========================================================
- Coverage                   80.97%   80.66%   -0.32%     
==========================================================
  Files                        1919     1919              
  Lines                      209785   209796      +11     
==========================================================
- Hits                       169875   169225     -650     
- Misses                      39910    40571     +661     

if activation_statically_quantized:
    # quantize output for statically quantized linear op
    root_module = quantizer.modules['']
    act_post_process_name = self.relu_node.name if self.relu_node else self.linear_node.name
Contributor commented:

If debug is true, do we still insert an observer at the output of the relu node? relu doesn't need output_scale and zero_point, right?

jerryzh168 (Contributor Author) replied:

Yes, we still insert an observer at the output of the relu. debug does not affect where observers are inserted; it only affects the quantized model we produce.
relu itself does not need output_scale/output_zero_point, but linear_relu and conv_relu do.
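
A minimal sketch of the point above (illustrative values, not code from this PR): the fused quantized linear + relu op takes the output scale and zero point explicitly, which is why an observer is still needed at the output of the relu node:

```python
import torch

# Illustrative qparams; in practice output_scale/output_zero_point are taken
# from the observer inserted at the output of the relu node.
x = torch.quantize_per_tensor(torch.randn(2, 4), scale=0.1, zero_point=0, dtype=torch.quint8)
w = torch.quantize_per_tensor(torch.randn(3, 4), scale=0.05, zero_point=0, dtype=torch.qint8)
packed = torch.ops.quantized.linear_prepack(w, None)    # pack weight (no bias)
y = torch.ops.quantized.linear_relu(x, packed, 0.2, 0)  # output_scale=0.2, output_zero_point=0
```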

@supriyar supriyar left a comment

Thanks for the quick fix! Left a small comment.

@supriyar

@jerryzh168, do we need a similar bugfix for conv2d + relu fusion as well? It seems we don't do fusion there either in convert.

@jerryzh168

> @jerryzh168, do we need a similar bugfix for conv2d + relu fusion as well? It seems we don't do fusion there either in convert.

Yes we do; will add that in a later PR.
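
For reference, the analogous conv2d + relu case mentioned above would look like this (illustrative values, not from this PR); the fused quantized conv2d + relu op likewise takes an explicit output scale and zero point, so the same observer placement applies:

```python
import torch

# Illustrative qparams; observers would supply these in practice.
x = torch.quantize_per_tensor(torch.randn(1, 2, 8, 8), scale=0.1, zero_point=0, dtype=torch.quint8)
w = torch.quantize_per_tensor(torch.randn(4, 2, 3, 3), scale=0.05, zero_point=0, dtype=torch.qint8)
# prepack(weight, bias, stride, padding, dilation, groups)
packed = torch.ops.quantized.conv2d_prepack(w, None, [1, 1], [0, 0], [1, 1], 1)
y = torch.ops.quantized.conv2d_relu(x, packed, 0.2, 0)  # output_scale=0.2, output_zero_point=0
```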

@facebook-github-bot

This pull request has been merged in 28869d5.

@facebook-github-bot facebook-github-bot deleted the gh/jerryzh168/531/head branch January 29, 2021 15:21
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
… {functional relu/module relu} (pytorch#50975)

Summary: Pull Request resolved: pytorch#50975

Test Plan: Imported from OSS

Reviewed By: supriyar

Differential Revision: D26032532

fbshipit-source-id: a084fb4fd711ad52b2da1c6378cbcc2b352976c6
