Fix implicit conversion from array to scalar in python bindings #15915
alalek merged 10 commits into opencv:3.4
Conversation
@VadimLevin The patch introduces 1000+ compile warnings. Please take a look: https://pullrequest.opencv.org/buildbot/builders/precommit_linux64/builds/23624/steps/compile%20release/logs/warnings%20%281490%29

@alalek Can you have a look at this change?
alalek
left a comment
Bindings changes look good to me.
Test looks big, but it is still fast (~0.05 sec). Also I would note that this problem is not related to norm() only, so more general checks can be added later.
A large part of these Python tests does not test accuracy in all flavors; these Python tests should, first of all, exercise the bindings generator.
| "int": ArgTypeInfo("int", FormatStrings.int, "0", True), | ||
| "float": ArgTypeInfo("float", FormatStrings.float, "0.f", True), | ||
| "double": ArgTypeInfo("double", FormatStrings.double, "0", True), | ||
| "c_string": ArgTypeInfo("char*", FormatStrings.string, '(char*)""') |
|
This change may break users' code, so we need to ensure that the impact is low. So we need tests. We can start with bindings argument handling/conversions. An initial part of the tests is here: alalek@pr15915_r
- Introduce ArgTypeInfo namedtuple instead of plain tuple.
If the strict conversion parameter for a type is set to true, it is
handled like an object argument in PyArg_ParseTupleAndKeywords and
converted to the concrete type with the appropriate pyopencv_to function
call.
- Remove dead code and unused variables.
- Fix implicit conversion from a numpy array with 1 element to a scalar.
- Fix narrowing conversion to size_t type.
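The ArgTypeInfo change above can be sketched as a minimal Python model. Field names follow the commit message and the mapping snippet quoted in this review; they are assumptions about the actual gen2.py definition, not a copy of it:

```python
from collections import namedtuple

# Hypothetical model of the ArgTypeInfo namedtuple from the commit
# message; the last field marks types that need strict conversion.
ArgTypeInfo = namedtuple('ArgTypeInfo',
                         ['atype', 'format_str', 'default_value',
                          'strict_conversion'])
# Let strict_conversion default to False (non-strict parsing).
ArgTypeInfo.__new__.__defaults__ = (False,)

simple_argtype_mapping = {
    # strict_conversion=True: parsed as a raw object and converted
    # with the matching pyopencv_to call, where extra checks run.
    "int":      ArgTypeInfo("int",    "i", "0",         True),
    "double":   ArgTypeInfo("double", "d", "0",         True),
    # c_string keeps the default, non-strict behavior.
    "c_string": ArgTypeInfo("char*",  "s", '(char*)""'),
}
```

A namedtuple keeps the existing tuple unpacking working while giving the generator named access to the new flag.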
ce183ba to 68c9757
- Enable tests with wrong conversion behavior
- Restrict passing None as value
- Restrict bool to integer/floating types conversion
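Taken together, the bullets above describe a stricter scalar parser. This is my own pure-Python sketch of the intended behavior, not the cv2.cpp code; a 1-element list stands in for a numpy array:

```python
def strict_parse_double(obj):
    """Model of strict conversion for a C++ double argument."""
    # bool is rejected first, since bool is a subclass of int in Python.
    if isinstance(obj, bool):
        raise TypeError("bool is not allowed for a floating-point argument")
    # None is no longer accepted as a value.
    if obj is None:
        raise TypeError("None is not allowed as a value")
    # A 1-element sequence (standing in for a numpy array) must not
    # silently decay to a scalar.
    if isinstance(obj, (list, tuple)):
        raise TypeError("array is not implicitly convertible to a scalar")
    if isinstance(obj, (int, float)):
        return float(obj)
    raise TypeError("expected a number, got %s" % type(obj).__name__)
```

With checks like these, the correct overload can be selected instead of silently coercing an array argument.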
68c9757 to 9ea4b7c
modules/python/src2/cv2.cpp
Outdated
    int _val = PyObject_IsTrue(obj);
    if(_val < 0)
    }
    if (obj == Py_None)
Currently "None" may be used for parameters with default values.
Lets try to keep this behavior.
However, it is not Pythonic and should be deprecated/removed (it is not good for "strict" overloads selector).
Perhaps:
- we can extent
ArgInfowith "hasDefaultValue" field.
    if (obj == Py_None && info.hasDefaultValue)
    {
        return true; // keep default value from declaration
    }
    else
    {
        failmsg...
    }
I've got your point and agree with it. I will add the default value check for None as a separate PR.
modules/python/src2/cv2.cpp
Outdated
    // Python definition alias to prevent define collisions
    #if defined(PY_LONG_LONG)
    #define CV_HAVE_LONG_LONG 1
Thanks for catching it. I've changed the check to use the appropriate type. To be removed.
- Remove unused macro
- Add better conversion for types to numpy type descriptors
- Add argument name to fail messages
- NoneType is treated as a valid argument; better handling will be added
as a standalone patch
2ebdaae to 795a597
93a5b92 to aff0cfe
- If a signed integer is positive, it can be safely converted
to unsigned
- Add check for plain Python 2 objects
- Add check for numpy scalars
- Add simple type_traits implementation for better code style
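The first bullet can be modeled in a few lines. This is my own sketch of the rule, not the actual cv2.cpp check:

```python
def can_cast_to_unsigned(value, bits=64):
    """Model: a signed integer converts to an unsigned C type only
    when it is non-negative and fits in the target width."""
    # bool is excluded explicitly, matching the restriction on
    # bool -> integer conversions introduced earlier in this PR.
    if isinstance(value, bool) or not isinstance(value, int):
        return False
    return 0 <= value < (1 << bits)
```

The same idea covers size_t (bits=64 on typical 64-bit platforms, which is an assumption here).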
67c1dbd to fe7fdb4
- Move type_traits to separate header
fe7fdb4 to abb0129
    // According to the numpy documentation:
    // There are 21 statically-defined PyArray_Descr objects for the built-in data-types
    PyArray_Descr* to = getNumpyTypeDescriptor<T>();
    if (canBeSafelyCasted<T>(obj, to))
canBeSafelyCasted<T>
This looks overcomplicated. The code doesn't use all 21 types, because it is OpenCV code (I see only 3 specializations of parseNumpyScalar: size_t, float, double).
I suggest removing the "stdx" replacement and just writing these specializations (or overloads).
The underlying idea of this generalized approach is to reduce code repetition and have a set of well-tested functions as building blocks. Basically, it is possible to cover all primitive types with 3 general functions:
- for signed integral types;
- for unsigned integral types;
- for floating point types;
and 1 specialized function for size_t.
Once all primitive types are covered by this set of functions, other cases can be covered by combining them.
For instance, vector<type> can be parsed as a sequence without fixed size, and cv::Size can be parsed as a sequence of size 2. To parse a single element, the generalized function is called.
If a new type is added (e.g. int64_t cannot be parsed now), it will automatically be covered by the signed integral parser, so the contributor should provide only tests for it.
These functions are not introduced in the current PR, but will be in further ones.
If you are against this idea, I'm OK to rewrite this part with parser specializations.
Vector conversions are another story. I still don't see problems with them.
If a new type
The set of types for bindings should be limited, to cover languages other than Python too.
Let's keep the patch as simple as possible.
@alalek
Added some tests to illustrate the current problem with parsing sequential types (such as vector, or Size as a sequence of size 2):
VadimLevin@07bdc4a
For vector<int> the skipped test either fails with some inner error or performs a narrowing conversion (double -> int), which we want to eliminate with the strict type check.
For cv::Size it simply rejects everything except a tuple.
So the main goal of the underlying idea that I tried to describe in the previous comment is the elimination of confusing and non-intuitive conversion behavior.
The implementation with templates may be a bit complicated, but in my opinion it has more advantages than disadvantages.
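The sequence cases described above can be modeled as follows. This is a hypothetical Python sketch; parse_int_strict and parse_size are illustrative names, not bindings APIs:

```python
def parse_int_strict(obj):
    # Reject bool, and reject the narrowing double -> int conversion
    # by accepting only true ints.
    if isinstance(obj, bool) or not isinstance(obj, int):
        raise TypeError("expected int, got %s" % type(obj).__name__)
    return obj

def parse_size(seq):
    # cv::Size is treated as a sequence of exactly 2 elements, so both
    # tuples and lists should be accepted (unlike the tuple-only
    # behavior the comment complains about).
    if not isinstance(seq, (list, tuple)) or len(seq) != 2:
        raise TypeError("Size expects a sequence of 2 elements")
    return tuple(parse_int_strict(v) for v in seq)
```

Building parse_size on top of the single-element parser is the composition idea from the earlier comment: fix the scalar parsers once and reuse them for sequences.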
@alalek @VadimLevin Do you have any conclusion on the patch?
- Made canBeSafelyCasted specialized only for size_t, so the
type_traits header became unused and was removed.
- Added clarification about the descriptor pointer
@alalek Please take a look. Vadim reduced the patch as discussed offline.
This pull request changes how a numpy array is handled: previously it was implicitly converted by the PyArg_ParseTupleAndKeywords format string. After removal of the possibly conflicting types from simple_argtype_mapping, pyopencv_to parsers are generated for them, where additional checks are done, so the correct overload is chosen.

Resolves #11426
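The before/after behavior summarized above can be illustrated with a pure-Python model (my own sketch; a 1-element list stands in for a numpy array):

```python
def old_parse_double(obj):
    # Old behavior: the "d" format specifier let a 1-element array
    # decay to a scalar via its float conversion.
    if isinstance(obj, (list, tuple)) and len(obj) == 1:
        return float(obj[0])
    return float(obj)

def new_parse_double(obj):
    # New behavior: the argument arrives as a raw object ("O") and a
    # generated pyopencv_to-style check accepts only real scalars, so
    # overload resolution is no longer confused by array inputs.
    if isinstance(obj, bool) or not isinstance(obj, (int, float)):
        raise TypeError("expected a scalar number")
    return float(obj)
```

Under the old model a call like norm(np.array([5.0])) could silently hit the scalar overload; under the new one it is rejected and the array overload is chosen instead.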