
Honor existing dim_param in shape inference #3896

Merged
askhade merged 5 commits into onnx:master from shinh:honor-existing-dim-param
Dec 15, 2021

Conversation

@shinh
Contributor

@shinh shinh commented Dec 15, 2021

Description

Before this change, an existing dim_param was overwritten by generated symbols such as `unk__`. `NonZero` is used to test the behavior.

Motivation and Context

We'd like to annotate the output values of ops that produce unknown dims (e.g., NonZero) by pre-setting the dim_param of such values. However, ONNX currently overwrites them.

Reproduce script:

```python
import onnx
import onnx.shape_inference
import torch


class NonZero(torch.nn.Module):
    def forward(self, x):
        return torch.nonzero(x)


x = torch.rand(7)
torch.onnx.export(NonZero(), x, "nonzero.onnx",
                  output_names=["y"],
                  dynamic_axes={"y": {0: "NZ"}})
model = onnx.load("nonzero.onnx")
model = onnx.shape_inference.infer_shapes(model)
print(model.graph.output)  # "NZ" is overwritten by "unk__0"
```

Previous behavior:

| inferred \ existing | dim_value | dim_param | N/A |
| --- | --- | --- | --- |
| dim_value | error | inferred | inferred |
| dim_param | existing | inferred | inferred |
| unk__ | existing | inferred | inferred |

This PR:

| inferred \ existing | dim_value | dim_param | N/A |
| --- | --- | --- | --- |
| dim_value | error | inferred | inferred |
| dim_param | existing | existing (1) | inferred |
| unk__ | existing | existing (2) | inferred |

I guess no one would object to the change for (2) in the table above, which is why I created this PR. I'm not sure about (1). To me, it would be more natural to check whether the existing dim_param equals the inferred one, as we do for dim_value, but that would break backward compatibility. What do you think?

@shinh shinh requested a review from a team as a code owner December 15, 2021 06:09
@shinh shinh changed the title from "Honor existing dim_param" to "Honor existing dim_param in shape inference" Dec 15, 2021
Before this change, existing dim_param was overwritten by generated
symbols like `unk__`. `NonZero` is used to test the behavior.

Signed-off-by: Shinichiro Hamaji <shinichiro.hamaji@gmail.com>
@shinh shinh force-pushed the honor-existing-dim-param branch from 21dbcfa to c774bf9 on December 15, 2021 06:14
Signed-off-by: Shinichiro Hamaji <shinichiro.hamaji@gmail.com>
Comment thread onnx/test/symbolic_shape_test.py Outdated
make_tensor_value_info("E", TensorProto.FLOAT, (2, -1))])
# the symbolic shape of E and output should be the same
assert self._get_shape_from_name(inferred_model, 'E') == self._get_shape_from_name(inferred_model, 'output')
make_tensor_value_info("E", TensorProto.FLOAT, (2, -1)),
Contributor
Maybe it would be better to change line 100 above and replace 'M' with None, and leave the current check?

Contributor Author
Ditto. Done!

Comment thread onnx/test/symbolic_shape_test.py Outdated
self._assert_valueinfo_shape(inferred_model, [make_tensor_value_info("C", TensorProto.FLOAT, (2, -1))])
# the symbolic shape of C and output should be the same
assert self._get_shape_from_name(inferred_model, 'C') == self._get_shape_from_name(inferred_model, 'output')
self._assert_valueinfo_shape(inferred_model, [
Contributor
Should we change line 79 to specify the output shape as [2, None] or [None, None], and leave the current check as-is?

Contributor Author
Ah, nice. Done!

```
  auto* existingDim = existingType->mutable_shape()->mutable_dim(i);
- if (!existingDim->has_dim_value()) {
+ if ((!existingDim->has_dim_value() && !existingDim->has_dim_param()) ||
+     inferredDim.has_dim_value()) {
```
Contributor
The condition

`!existingDim->has_dim_value() && !existingDim->has_dim_param()`

can be folded into the for loop on line 179. That way, when the existing shape is empty, we don't have to run through two for loops (one on line 179 and one on line 184).

Contributor Author
Is this what you meant? be56060

Note I think we still need `(!existingDim->has_dim_value() && !existingDim->has_dim_param())` here for the case where existingType has a known rank (i.e., `existingType->has_shape()` is true) but unknown dimensions.

Contributor
Yes, thanks for adding line 175. And yes, I agree we still need this check here for the case where existingType has a known rank.
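The per-dimension merge rule being discussed can be sketched in plain Python. This is a hypothetical model of the behavior tables in the PR description, not the actual C++ in `mergeShapesAndTypes`; `None` stands for an unset field, and the function name is illustrative.

```python
# Hypothetical sketch of the per-dimension merge rule after this PR,
# mirroring the "This PR" table in the description.
def merge_dim(existing_value, existing_param, inferred_value, inferred_param):
    """Return the (dim_value, dim_param) pair that survives the merge."""
    if existing_value is not None:
        if inferred_value is not None and inferred_value != existing_value:
            raise ValueError("mismatched dim_value")  # the "error" cell
        return existing_value, None   # a concrete existing value always wins
    if inferred_value is not None:
        return inferred_value, None   # an inferred dim_value overrides a symbol
    if existing_param is not None:
        return None, existing_param   # keep the pre-set dim_param (the fix)
    return None, inferred_param       # nothing existing: take the inferred symbol

# The pre-set "NZ" symbol now survives an inferred generated symbol:
print(merge_dim(None, "NZ", None, "unk__0"))  # -> (None, 'NZ')
```

Cell (1) of the table corresponds to the `existing_param` branch being taken even when the inferred side carries a different user-visible dim_param, which is the backward-compatibility question raised above.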

shinh and others added 3 commits December 16, 2021 05:42
Signed-off-by: Shinichiro Hamaji <shinichiro.hamaji@gmail.com>
Signed-off-by: Shinichiro Hamaji <shinichiro.hamaji@gmail.com>
@askhade askhade merged commit e207261 into onnx:master Dec 15, 2021
askhade added a commit that referenced this pull request Jan 19, 2022
#3902)

* Fix typos (#3894)

Signed-off-by: Lewis Tunstall <lewis.c.tunstall@gmail.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Honor existing dim_param in shape inference (#3896)

* Honor existing dim_param

Before this change, existing dim_param was overwritten by generated
symbols like `unk__`. `NonZero` is used to test the behavior.

Signed-off-by: Shinichiro Hamaji <shinichiro.hamaji@gmail.com>

* Fix test expectation of symbolic_shape_test.py

Signed-off-by: Shinichiro Hamaji <shinichiro.hamaji@gmail.com>

* Keep the original check as suggested in review

Signed-off-by: Shinichiro Hamaji <shinichiro.hamaji@gmail.com>

* Simplify the logic of mergeShapesAndTypes

Signed-off-by: Shinichiro Hamaji <shinichiro.hamaji@gmail.com>

Co-authored-by: Ashwini Khade <askhade@microsoft.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* for issue 3849 to confirm that type check is performed during checker.check_model call

Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* fix path to InferenceError

Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* replace whitelist by safelist (#3900)

Signed-off-by: Alexandre Eichenberger <alexe@us.ibm.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* add checker test case for input type mismatch conflict

Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* remove old numpy in ipynb (#3916)

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Shapeinf for functions (#3722)

* add shape inference for model local functions

Signed-off-by: Ashwini Khade <askhade@microsoft.com>

* Plus tests and more changes

Signed-off-by: Ashwini Khade <askhade@microsoft.com>

* fix typo

Signed-off-by: Ashwini Khade <askhade@microsoft.com>

* Plus updates

Signed-off-by: Ashwini Khade <askhade@microsoft.com>

* Plus updates per review

Signed-off-by: Ashwini Khade <askhade@microsoft.com>

* plus updates

Signed-off-by: Ashwini Khade <askhade@microsoft.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Fix old ConvTranspose shape inference and softmax upgrader (#3893)

* Fixed ConvTranspose-1 shape inference

Brings change from #3188 (for ConvTranspose-11) into old shape inference
Signed-off-by: Matteo Salvarezza <matteo.salvarezza@gmail.com>

* Fix softmax adapter

Signed-off-by: Matteo Salvarezza <matteo.salvarezza@gmail.com>

Co-authored-by: G. Ramalingam <grama@microsoft.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Fix Linux i686 Release CI failure due to the latest NumPy (#3918)

* 3.8 and 3.9 use numpy 1.21.5

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* 1.21.4

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* 6be8011a073feeca08d256ac64335a19fc8eee4c6312668b6781ace71db0de20

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* 2021-12-19-a4d7f5a

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* 1.21.5

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* 1.16.6

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* 1.16.6

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* do not check generated

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* remove unrelated numpy

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* if

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* remove generation

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* name

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* add back

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* for testing

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* no 1.16.6

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Remind release manager to remove old onnx-weekly packages after release (#3923)

* remind release manager to remove old onnx-weekly packages

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* add steps

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Simplify function definition of context-dependent functions (#3882)

* Simplify function definition of context-dependent functions

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>

* Add missing parenthesis

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>

* Fix errors in function defs

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>

* Eliminate unused variable

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>

* Add int64 type specifier to literal

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>

Co-authored-by: Ashwini Khade <askhade@microsoft.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Migration to using main branch (#3925)

* add warning announce

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>

* Rename to main branch globally

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Fix the bug of shape in docs (#3927)

* fix the bug of shape

Signed-off-by: namasikanam <namasikanam@gmail.com>

* fix the bug of shape

Signed-off-by: namasikanam <namasikanam@gmail.com>

Co-authored-by: Ashwini Khade <askhade@microsoft.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Append dim even both dim value and param are not set (#3828)

* Append dim even both dim value and param are not set

Signed-off-by: Joe <joe@preferred.jp>

* Copy dim

Signed-off-by: Joe <joe@preferred.jp>

* Revert "Copy dim"

This reverts commit 6623505.

Signed-off-by: Joe <joe@preferred.jp>

* Simplify code

Signed-off-by: Joe <joe@preferred.jp>

Co-authored-by: Ashwini Khade <askhade@microsoft.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* bump to 10.15 in azp (#3941)

Signed-off-by: Chun-Wei Chen <jacky82226@gmail.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Add explanation to run gtest (#3933)

Signed-off-by: Joe <joe@preferred.jp>

Co-authored-by: Ashwini Khade <askhade@microsoft.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

* Update TreeEnsembleClassifier and TreeEnsembleRegressor to support tensor as attributes (#3897)

* update TreeEnsembleClassifier and TreeEnsembleRegressor

Signed-off-by: xavier dupré <xavier.dupre@gmail.com>

* change the type of another attribute

Signed-off-by: xavier dupré <xavier.dupre@gmail.com>

* add missing file

Signed-off-by: xavier dupré <xavier.dupre@gmail.com>

* update documentation

Signed-off-by: xavier dupré <xavier.dupre@gmail.com>

* eol

Signed-off-by: xavier dupré <xavier.dupre@gmail.com>

* add field with _as_tensor

Signed-off-by: xavier dupré <xavier.dupre@gmail.com>

* fix error messages

Signed-off-by: xavier dupré <xavier.dupre@gmail.com>

* fix error message

Signed-off-by: xavier dupré <xavier.dupre@gmail.com>

* fix missing change

Signed-off-by: xavier dupré <xavier.dupre@gmail.com>

Co-authored-by: xavier dupré <xavier.dupre@gmail.com>
Co-authored-by: G. Ramalingam <grama@microsoft.com>
Signed-off-by: Liqun Fu <liqfu@microsoft.com>

Co-authored-by: lewtun <lewis.c.tunstall@gmail.com>
Co-authored-by: Shinichiro Hamaji <shinichiro.hamaji@gmail.com>
Co-authored-by: Ashwini Khade <askhade@microsoft.com>
Co-authored-by: Alexandre Eichenberger <alexe@us.ibm.com>
Co-authored-by: Chun-Wei Chen <jacky82226@gmail.com>
Co-authored-by: Matteo Salvarezza <matteo.salvarezza@gmail.com>
Co-authored-by: G. Ramalingam <grama@microsoft.com>
Co-authored-by: Xingyu Xie <namasikanam@gmail.com>
Co-authored-by: Joe <joe@preferred.jp>
Co-authored-by: Xavier Dupré <xadupre@users.noreply.github.com>
Co-authored-by: xavier dupré <xavier.dupre@gmail.com>