fix bug in onnx div operator when input[1] is variable #21203
hujunyao wants to merge 1 commit into opencv:4.x from
Conversation
rogday left a comment
Thank you for the contribution!
Please consider adding tests (test_onnx_importer.cpp) and test data (opencv_extra/testdata/dnn/onnx/). You can use opencv_extra/testdata/dnn/onnx/generate_onnx_models.py as a reference.
```cpp
float coeff;
if ( isDiv )
{
    if ( constId == 0 )
    {
        layerParams.set("power", -1);
        coeff = blob_value;
    }
    else
        coeff = 1.0 / blob_value;
}
else
    coeff = blob_value;
```
We should always initialize variables to avoid certain classes of bugs.
The Power layer calculates (x*scale + shift)^power, so in your case (const/tensor) it will be 1/(x*scale) instead of scale/x. We could use something like:

```cpp
float coeff = blob_value;
if ( isDiv )
{
    coeff = 1.0 / blob_value;
    if ( constId == 0 )
    {
        layerParams.set("power", -1);
    }
}
```

But I'm not sure it will be accurate enough.
Consider the inputs 2/x and x/2; I think it should be:

```cpp
float coeff = blob_value;
if ( isDiv )
{
    if ( constId == 0 )
    {
        layerParams.set("power", -1);
    }
    else
        coeff = 1.0 / blob_value;
}
```
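For context, the effect of the two scale/power mappings discussed in this thread can be checked numerically with plain NumPy, independent of OpenCV. This is a sketch under the assumption that the Power layer evaluates (x*scale + shift)^power with shift = 0; the values c and x are hypothetical examples chosen to match the discussion:

```python
import numpy as np

c = 2.0              # the constant operand (blob_value in the snippets above)
x = np.float32(4.0)  # the variable tensor input

# Case const / tensor (constId == 0), target value: c / x = 2 / 4 = 0.5

# Mapping A: scale = c, power = -1  ->  (x * c) ** -1 == 1 / (c * x)
a = (x * c) ** -1
print(a)   # 0.125, i.e. 1 / (2 * 4), not c / x

# Mapping B: scale = 1 / c, power = -1  ->  (x / c) ** -1 == c / x
b = (x * (1.0 / c)) ** -1
print(b)   # 0.5, the expected c / x

# Case tensor / const (constId != 0), target value: x / c = 4 / 2
d = x * (1.0 / c)
print(d)   # 2.0
```

The 0.125 vs 0.5 values here line up with the outputs reported elsewhere in this thread for the generated const_div model.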
@hujunyao, I just tested your code using the following snippet, which you can use to add tests (it needs input/output saving; you can look at save_onnx_data_and_model in opencv_extra/testdata/dnn/onnx/generate_onnx_models.py):

```python
import numpy as np
import onnx

node = onnx.helper.make_node(
    'Div',
    inputs=['x', 'y'],
    outputs=['z'],
)
x = np.array([2]).astype(np.float32)
y = np.array([[4, 4], [4, 4]]).astype(np.float32)
z = x / y
X = onnx.helper.make_tensor('x', onnx.TensorProto.FLOAT, x.shape, x)
Y = onnx.helper.make_tensor_value_info('y', onnx.TensorProto.FLOAT, y.shape)
Z = onnx.helper.make_tensor_value_info('z', onnx.TensorProto.FLOAT, z.shape)
graph = onnx.helper.make_graph([node], 'const_div', [Y], [Z], initializer=[X])
model = onnx.helper.make_model(graph, producer_name='const_div')
onnx.save(model, 'const_div.onnx')
```

It outputs a mat of 0.125, which is consistent with my prediction: 1/(2*4), whereas the proposed solution outputs 0.5 as expected.
I thought it wouldn't go into this branch, since it is inside:

```cpp
if (constId != -1 && haveVariables)
{
    ..................
}
else if (!haveVariables)
```

I thought this branch only handles a constant and a variable; when two constants are divided, it goes into the !haveVariables branch. But I'm not sure about that.
Closed in favor of #21297
Fix the ONNX Div operator bug.
When input[1] is a variable, it should be calculated as x^-1.
Simple code to show the bug:
C++ reproducer: