[quant][graph] Add quantized batch_norm2d_relu to graph mode #36552

Closed
supriyar wants to merge 10 commits into gh/supriyar/86/base from gh/supriyar/86/head

Conversation

supriyar (Contributor) commented Apr 14, 2020

Stack from ghstack:

Summary:
Perform the fusion for both inplace and non-inplace relu.
Functional relu is tested as well.
Functional batch_norm is not a typical use case (it expects the weight, bias, running mean, and running var as explicit arguments), so it is not tested.

Test Plan:
test_quantize_script.py test_batch_norm2d_relu

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: D21075253
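For reference, the float-domain semantics of the fused op can be sketched in plain Python. This is an illustrative reference only (the function name and eps default are assumptions; the actual `quantized::batch_norm2d_relu` kernel operates on quantized tensors per channel):

```python
import math

def batch_norm_relu(x, mean, var, gamma, beta, eps=1e-5):
    """Float reference for fused batch_norm + relu on one value:
    normalize with running stats, apply the affine scale/shift,
    then clamp negatives to zero."""
    y = (x - mean) / math.sqrt(var + eps) * gamma + beta
    return max(y, 0.0)
```

Fusing the two ops lets the quantized kernel skip materializing the intermediate batch_norm output and requantize only once.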

@supriyar supriyar requested a review from apaszke as a code owner April 14, 2020 01:52
supriyar added a commit that referenced this pull request Apr 14, 2020
ghstack-source-id: c3fea27
Pull Request resolved: #36552
@facebook-github-bot facebook-github-bot added the oncall: jit Add this issue/PR to JIT oncall triage queue label Apr 14, 2020
dr-ci Bot commented Apr 14, 2020

💊 Build failures summary and remediations

As of commit d28b1f9 (more details on the Dr. CI page):


- 1/1 failures possibly* introduced in this PR
  - 1/1 non-CircleCI failure(s)

This comment was automatically generated by Dr. CI.

supriyar added a commit that referenced this pull request Apr 14, 2020
ghstack-source-id: 57d2074
Pull Request resolved: #36552
supriyar added a commit that referenced this pull request Apr 14, 2020
ghstack-source-id: 0320ab6
Pull Request resolved: #36552
Comment thread test/quantization/test_quantize_script.py Outdated
Comment thread test/quantization/test_quantize_script.py Outdated
Comment thread test/quantization/test_quantize_script.py Outdated
Comment thread test/quantization/test_quantize_script.py
Comment thread test/quantization/test_quantize_script.py Outdated
    for individual ops end to end.
    """
-   def _test_op_impl(self, SingleOpModule, data, quantized_op):
+   def _test_op_impl(self, SingleOpModule, data, quantized_op, **kwargs):
Contributor
can you change this to take a module instance? then you can initialize the module before passing into this function.
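A minimal sketch of the suggested refactor (everything except the `_test_op_impl` name is hypothetical): the helper accepts an already constructed module instance rather than a class, so the caller decides how to initialize it.

```python
class EchoModule:
    """Hypothetical stand-in for a single-op test module."""
    def __call__(self, x):
        return x

def _test_op_impl(module, data, quantized_op):
    # The real helper would script and quantize `module`, then assert
    # that `quantized_op` appears in the resulting graph; here we only
    # illustrate the instance-based calling convention by running it.
    return module(data)

# The caller constructs the module (with whatever args it needs) up front.
out = _test_op_impl(EchoModule(), 42, "quantized::batch_norm2d_relu")
```

Passing an instance removes the need to thread constructor arguments (like `inplace`) through the helper.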

Contributor @jerryzh168 left a comment

please refactor _test_op_impl

@supriyar supriyar requested a review from jerryzh168 April 16, 2020 00:01
Comment on lines +1436 to +1438

    for Model in [BNRelu, BNFuncRelu, BNFuncInplaceRelu]:
        for inplace in [True, False]:
            model = self._test_op_impl(Model(inplace), data, "quantized::batch_norm2d_relu")
Contributor

please change this loop as well, we don't need to loop through [True, False] anymore
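The simplified loop might look like the following (the `run_op` helper and stub classes here are illustrative stand-ins, not the test's actual code):

```python
class BNRelu:
    """Stand-in for the test's BN+ReLU module variants."""
    def __call__(self, x):
        return max(x, 0.0)

class BNFuncRelu(BNRelu): pass
class BNFuncInplaceRelu(BNRelu): pass

def run_op(module, data, quantized_op):
    # Placeholder for the instance-based test helper.
    return module(data)

# Each Model variant already encodes its relu flavor, so there is no
# separate loop over inplace=[True, False].
results = [run_op(Model(), -1.0, "quantized::batch_norm2d_relu")
           for Model in [BNRelu, BNFuncRelu, BNFuncInplaceRelu]]
```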

Contributor Author

Good catch! Updated it

@supriyar supriyar requested a review from jerryzh168 April 16, 2020 00:49
Contributor @jerryzh168 left a comment

Looks good, thanks!

facebook-github-bot (Contributor)

This pull request has been merged in 17c268b.

@facebook-github-bot facebook-github-bot deleted the gh/supriyar/86/head branch April 20, 2020 14:17
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
…#36552)

Summary:
Pull Request resolved: pytorch#36552

Imported from OSS

Differential Revision: D21075253

fbshipit-source-id: 0a07ea477cab19abf1d1b0856e623b1436240da1

Labels

Merged oncall: jit Add this issue/PR to JIT oncall triage queue


4 participants