
StereoDepth node uses "spec" values not calibration values -- unexpected and significant depth differences #358

@diablodale

Description

From what I can discern, the StereoDepth node does not use the calibration data stored on the device; it uses spec (datasheet) values instead.
This is unexpected, since the factory performed a device-specific calibration, and the user may have done their own calibration. Yet StereoDepth does not use these calibrated values.

Setup

  • Microsoft Windows [Version 10.0.19044.1469]
  • VS2019 v16.11.9
  • depthai-core main

Repro

Discovered while writing my app and comparing depth values calculated by StereoDepth on-device versus depth computed on the host from the disparity output.

// setup
monos.setResolution(dai::MonoCameraProperties::SensorResolution::THE_400_P);  // applied to both mono cameras
stereoDepth->setSubpixel(true);
stereoDepth->setExtendedDisparity(false);
stereoDepth->initialConfig.setConfidenceThreshold(255);
stereoDepth->initialConfig.setLeftRightCheck(false);
// then setup xouts for both stereoDepth.disparity and stereoDepth.depth

// Data direct from EEPROM calibration on my OAK-D
const float baseline_mm = 10.0f * std::get<dai::CalibrationHandler>(calibration).getBaselineDistance(dai::CameraBoardSocket::RIGHT, dai::CameraBoardSocket::LEFT, false);
// baseline_mm = 73.0895758
const float focalX = std::get<dai::CalibrationHandler>(calibration).getCameraIntrinsics(dai::CameraBoardSocket::RIGHT, disparity.cols, disparity.rows)[0][0];
// focalX = 428.242584

// Data by spec
const auto baselinespec_mm = 10.0f * std::get<dai::CalibrationHandler>(calibration).getBaselineDistance();
// baselinespec_mm = 75.0
const auto fovspec = std::get<dai::CalibrationHandler>(calibration).getFov(dai::CameraBoardSocket::RIGHT, true);
// fovspec = 71.86
const float focalCalc = disparity.cols / (2.0f * std::tanf(DEGREES_TO_RADIANS(fovspec / 2.0f)));
// focalCalc = 441.575653

// my OAK-D, fixed position, therefore has fixed disparity and depth at specific pixel. Values from same seqNum are...
uint16_t subpixel_disparity = 170;  // value direct from stereoDepth.disparity feed at pixel 200, 320
uint16_t depth = 1559; // value direct from stereoDepth.depth feed at pixel 200, 320

Result

Doing host-side math on subpixel_disparity to generate a depth in mm, the result exactly matches the StereoDepth-calculated value only if the spec values are used. That is an 8.6 cm delta from the calibrated result.

// spec_depth = 1558.502
float spec_depth = 75.0f * 441.575653f * 8.0f / 170.0f;

// calibration_depth = 1472.944
float calibration_depth = 73.0895758f * 428.242584f * 8.0f / 170.0f;

Expected

By default, I expect StereoDepth to use the calibrated values stored in the EEPROM.
Optionally, an API such as StereoDepth::setDepthBySpec(bool) could enable use of spec values instead.

Workaround

Do depth on the host.
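A minimal sketch of that workaround, assuming the calibrated baseline and focal length were read from the CalibrationHandler as in the repro above, and the disparity frame arrives as a uint16_t buffer in subpixel (÷8) units. Function name and the zero-means-invalid convention are my own choices:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Convert a StereoDepth subpixel-disparity frame to depth in millimeters on
// the host, using the calibrated baseline/focal instead of the spec values
// the device appears to use. Zero disparity maps to zero depth (invalid).
std::vector<uint16_t> disparityToDepthMm(const std::vector<uint16_t>& disparity,
                                         float baselineMm, float focalPx) {
    std::vector<uint16_t> depth(disparity.size(), 0);
    const float scale = baselineMm * focalPx * 8.0f;  // subpixel: raw = pixels * 8
    for (std::size_t i = 0; i < disparity.size(); ++i) {
        if (disparity[i] != 0) {
            depth[i] = static_cast<uint16_t>(scale / disparity[i] + 0.5f);
        }
    }
    return depth;
}
```

With baselineMm = 73.0895758 and focalPx = 428.242584 this reproduces the ~1473 mm calibration-based value at the test pixel.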
