
[quant][pt1] Uninitialize the accumulation buffer to save some overhead#27005

Closed
jianyuh wants to merge 2 commits into gh/jianyuh/32/base from gh/jianyuh/32/head

Conversation

@jianyuh
Member

@jianyuh jianyuh commented Sep 27, 2019

Stack from ghstack:

Similar to #27002, we want to save some overhead.

Differential Revision: [D17641819](https://our.internmc.facebook.com/intern/diff/D17641819/)
@pytorchbot added the `module: operators` and `oncall: quantization` (Quantization support in PyTorch) labels Sep 27, 2019
jianyuh added a commit that referenced this pull request Sep 27, 2019
Similar to #27002, we want to save some overhead.

Differential Revision: [D17641819](https://our.internmc.facebook.com/intern/diff/D17641819/)

ghstack-source-id: 90963482
Pull Request resolved: #27005
@jianyuh jianyuh added this to the 1.3 milestone Sep 28, 2019
Collaborator

@jamesr66a jamesr66a left a comment


There's an error, please see CI

@jianyuh
Member Author

jianyuh commented Sep 28, 2019

Right, it shows the following error:

Sep 28 00:00:00 RuntimeError: It is currently not supported to specify a dtype that doesn't match the input tensor's dtype via empty_like. Specified: int Input tensor's dtype: c10::quint8 (empty_like at /var/lib/jenkins/workspace/aten/src/ATen/native/TensorFactories.cpp:220)

I think empty_like still doesn't support inputs whose dtype is quantized uint8/int8 (kQUInt8/kQInt8). cc @jerryzh168.

@jianyuh jianyuh requested a review from jerryzh168 September 28, 2019 02:43
@dskhudia
Contributor

Does empty_zeros call something like memset?

@jianyuh jianyuh removed this from the 1.3 milestone Sep 29, 2019
@jerryzh168
Contributor

…some overhead"

Similar to #27002, we want to save some overhead.

Differential Revision: [D17641819](https://our.internmc.facebook.com/intern/diff/D17641819/)

jianyuh added a commit that referenced this pull request Sep 30, 2019
Pull Request resolved: #27005

Similar to #27002, we want to save some overhead.
ghstack-source-id: 91046563

Differential Revision: [D17641819](https://our.internmc.facebook.com/intern/diff/D17641819/)
@jianyuh
Member Author

jianyuh commented Sep 30, 2019

the error is raised from: https://bddppq.github.io/codebrowser/pytorch/pytorch/aten/src/ATen/native/TensorFactories.cpp.html#217

OK, it looks like at::empty_like doesn't support the case where the input tensor's dtype is quantized uint8/int8 (kQUInt8/kQInt8) and the options specify a different dtype.

I have now switched to at::empty instead.

Collaborator

@jamesr66a jamesr66a left a comment


lgtm


@z-a-f z-a-f left a comment


LGTM

zdevito pushed a commit to zdevito/ATen that referenced this pull request Oct 1, 2019
Summary:
Pull Request resolved: pytorch/pytorch#27005

Similar to pytorch/pytorch#27002, we want to save some overhead.
ghstack-source-id: 91046563

Test Plan: CI

Differential Revision: D17641819

fbshipit-source-id: 9320919242a48f48532035e61d9844de671d39af
@facebook-github-bot
Contributor

This pull request has been merged in 6a09676.

jamesr66a pushed a commit that referenced this pull request Oct 3, 2019
Summary:
Pull Request resolved: #27005

Similar to #27002, we want to save some overhead.
ghstack-source-id: 91046563

Test Plan: CI

Differential Revision: D17641819

fbshipit-source-id: 9320919242a48f48532035e61d9844de671d39af
jamesr66a pushed a commit that referenced this pull request Oct 3, 2019
jamesr66a pushed a commit that referenced this pull request Oct 3, 2019
jamesr66a pushed a commit that referenced this pull request Oct 4, 2019
jamesr66a pushed a commit that referenced this pull request Oct 4, 2019
soumith pushed a commit that referenced this pull request Oct 7, 2019
@facebook-github-bot facebook-github-bot deleted the gh/jianyuh/32/head branch October 28, 2019 22:16
pdlive215 pushed a commit to pdlive215/pytorch that referenced this pull request Nov 27, 2019