Support for add relu functional module #26612
Closed
raghuramank100 wants to merge 4 commits into gh/raghuramank100/32/base from
Conversation
Add support for an add_relu functional module; this allows fusion of the add and relu quantized operations. Differential Revision: [D17518268](https://our.internmc.facebook.com/intern/diff/D17518268/) [ghstack-poisoned]
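The PR wraps the functional add + relu pattern in a module so the quantization flow can observe it and fuse it into a single quantized op on conversion. A minimal sketch of how such a wrapper is used (the `AddReluModel` class here is hypothetical; `FloatFunctional` lives under `torch.ao.nn.quantized` in newer releases):

```python
import torch
import torch.nn as nn
from torch.nn.quantized import FloatFunctional  # torch.ao.nn.quantized in newer releases


class AddReluModel(nn.Module):
    """Hypothetical model wrapping functional add + relu in a module,
    so the quantization flow can observe and later fuse the pair."""

    def __init__(self):
        super().__init__()
        self.ff = FloatFunctional()

    def forward(self, x, y):
        # Equivalent to torch.relu(x + y) in float mode; after conversion
        # the quantized counterpart can run a single fused add_relu kernel.
        return self.ff.add_relu(x, y)


m = AddReluModel()
out = m(torch.zeros(2, 3), torch.full((2, 3), -1.0))  # relu clamps the sum to zero
```

In the usual eager-mode flow, `prepare` attaches observers to the `FloatFunctional` instance and `convert` swaps it for its quantized counterpart, which is what makes the fusion possible.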
raghuramank100 pushed a commit that referenced this pull request on Sep 21, 2019
Add support for add relu functional module, this allows for fusion of add and relu quantized operations Differential Revision: [D17518268](https://our.internmc.facebook.com/intern/diff/D17518268/) ghstack-source-id: 90556583 Pull Request resolved: #26612
jerryzh168 reviewed on Sep 23, 2019
test/common_quantization.py (Outdated)

    @classmethod
    def from_float(cls, mod):
        new_mod = cls(quantized=True)
jerryzh168 (Contributor): why do you need to do this? Isn't this done by the flow?
raghuramank100 (Author): Rewrote the tests to match how we test other modules.
jerryzh168 requested changes on Sep 23, 2019
jerryzh168 (Contributor) left a comment:
I think there is some problem with the original test.
raghuramank100 pushed a commit that referenced this pull request on Sep 30, 2019
Pull Request resolved: #26612 Add support for add relu functional module, this allows for fusion of add and relu quantized operations ghstack-source-id: 91055976 Differential Revision: [D17518268](https://our.internmc.facebook.com/intern/diff/D17518268/)
dzhulgakov approved these changes on Sep 30, 2019
z-a-f approved these changes on Sep 30, 2019
    def test_functional_module(self, train_mode):
        model = ModelWithFunctionals()
        xq = torch.quantize_per_tensor(torch.rand(10, 1, dtype=torch.float), 0.01, 30, torch.quint8)
        self.checkScriptable(model, [(xq.dequantize(), xq.dequantize())], check_save_load=True)
nit: factor out the xq.dequantize() -- it is used in multiple places.
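The suggested factoring might look like this (a sketch; `checkScriptable` and the model come from the test harness, so only the input construction is shown, and the variable name `x` is an assumption):

```python
import torch

# Quantize a random float tensor, then compute the dequantized
# reference input once instead of calling dequantize() twice.
xq = torch.quantize_per_tensor(
    torch.rand(10, 1, dtype=torch.float), 0.01, 30, torch.quint8)
x = xq.dequantize()   # factored out once
inputs = [(x, x)]     # previously [(xq.dequantize(), xq.dequantize())]
```

`inputs` would then be passed to `self.checkScriptable(model, inputs, check_save_load=True)` as in the original test.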
This pull request has been merged in bdcaf63.
jamesr66a pushed a commit that referenced this pull request on Oct 3, 2019
Summary: Pull Request resolved: #26612 Add support for add relu functional module, this allows for fusion of add and relu quantized operations ghstack-source-id: 91055976 Test Plan: buck test caffe2/test:quantization -- 'test_functional_module \(test_quantization\.FunctionalModuleTest\)' --print-passing-details Differential Revision: D17518268 fbshipit-source-id: e1e8b4655d6b32405863ab9d1c7da111fb4343cc
soumith pushed a commit that referenced this pull request on Oct 7, 2019
Summary: Pull Request resolved: #26612 Add support for add relu functional module, this allows for fusion of add and relu quantized operations ghstack-source-id: 91055976 Test Plan: buck test caffe2/test:quantization -- 'test_functional_module \(test_quantization\.FunctionalModuleTest\)' --print-passing-details Differential Revision: D17518268 fbshipit-source-id: e1e8b4655d6b32405863ab9d1c7da111fb4343cc
pdlive215 pushed a commit to pdlive215/pytorch that referenced this pull request on Nov 27, 2019
Summary: Pull Request resolved: pytorch#26612 Add support for add relu functional module, this allows for fusion of add and relu quantized operations ghstack-source-id: 91055976 Test Plan: buck test caffe2/test:quantization -- 'test_functional_module \(test_quantization\.FunctionalModuleTest\)' --print-passing-details Differential Revision: D17518268 fbshipit-source-id: e1e8b4655d6b32405863ab9d1c7da111fb4343cc
Stack from ghstack:
Add support for an add_relu functional module; this allows fusion of the add and relu quantized operations.
Differential Revision: D17518268