
python calls to fisheye calibration fail with exception (matrix dimension errors) #5534

@Algomorph

Description


The following call succeeds in python:

err1, K1, d1, rvecs, tvecs = cv2.calibrateCamera(objpoints, limgpoints, frame_dims,
                                                 identity, blank, flags=flags, criteria=criteria)

The following call with identical objpoints and limgpoints parameters fails:

err, K1, d1, rvecs, tvecs = cv2.fisheye.calibrate(objectPoints=objpoints, imagePoints=limgpoints,
                                                  image_size=frame_dims, K=identity, D=blank2)

Firstly, objpoints has to be reshaped in Python from (<num points in set>, 3) to (<num points in set>, 1, 3) to pass the objectPoints.type() == CV_64FC3 check. That is not a bug per se, but it is counter-intuitive in the Python wrapper specifically: cv2.calibrateCamera does not require this, so current online examples don't do it.
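A minimal sketch of the reshape (numpy only; N and the zero-filled points are placeholders for a real calibration pattern):

```python
import numpy as np

# Placeholder board: N object points of shape (N, 3), the layout the
# usual cv2.calibrateCamera tutorials produce.
N = 54
pts = np.zeros((N, 3), np.float32)

# cv2.fisheye.calibrate wants CV_64FC3, i.e. shape (N, 1, 3), dtype float64.
pts_fisheye = pts.reshape(N, 1, 3).astype(np.float64)

assert pts_fisheye.shape == (N, 1, 3)
assert pts_fisheye.dtype == np.float64
```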

Secondly, once the type of the object points is correct, the following exception is thrown:

cv2.error: ..../opencv/modules/core/src/matmul.cpp:900: error: (-215) a_size.width == len in function gemm

I have determined that it occurs on (current) line 1337 of fisheye.cpp, in cv::internal::InitExtrinsics. Relevant excerpt:

calcCovarMatrix(objectPoints, covObjectPoints, objectPointsMean, COVAR_NORMAL | COVAR_COLS);

Mat T = -R * objectPointsMean; //assertion failed here due to wrong dimensions of objectPointsMean
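As an illustration of the mismatch (a numpy sketch, not the OpenCV internals): with COVAR_COLS each column of objectPoints is one sample, so objectPointsMean should come out as a 3x1 column vector for the 3x3 rotation R to multiply it. If the mean instead spans the samples, the product has incompatible dimensions, which is what the gemm assertion is checking:

```python
import numpy as np

# 3 x N matrix of object points, one point per column (COVAR_COLS layout).
N = 10
object_points = np.random.rand(3, N)

R = np.eye(3)  # stand-in for the 3x3 rotation

# Correct shape: averaging over the columns gives a 3x1 vector, and
# the equivalent of T = -R * objectPointsMean is well-defined.
mean_ok = object_points.mean(axis=1, keepdims=True)   # shape (3, 1)
T = -R @ mean_ok                                      # shape (3, 1)

# Wrong shape, analogous to the failing assertion: a 1 x N "mean"
# cannot be left-multiplied by a 3x3 matrix.
mean_bad = object_points.mean(axis=0, keepdims=True)  # shape (1, N)
try:
    _ = -R @ mean_bad
except ValueError:
    pass  # numpy rejects it, just as gemm asserts a_size.width == len
```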
