Inspiration

This project was inspired by the HoloRay Motion-Tracked Annotation Challenge, which highlights a key limitation in collaborative medical platforms: annotations drawn on live medical video remain static while the camera and anatomy move. In procedures like laparoscopy, this can cause annotations to lose context and reduce clarity.

What it does

The system keeps annotations visually attached to the anatomy they describe, so a mark drawn on live medical video stays in place even as the camera and the tissue move.

How we built it

The system begins with a user-selected region in the video. Feature points are tracked using Lucas–Kanade optical flow, and frame-to-frame motion is estimated with RANSAC. To reduce drift and recover from tracking loss, we periodically validate the tracker using template matching.

Challenges we ran into

Fast camera motion, occlusion, and deformation caused early versions to drift or jitter. While the system is not perfect, it performs significantly better than our initial attempts and remains stable in most real-world scenarios. Building this solution made us excited to continue improving it, explore more advanced tracking techniques, and deepen our understanding of computer vision for medical applications.

Accomplishments that we're proud of

Built a fully functional real-time motion-tracked annotation system from scratch using classical computer vision.

What we learned

We gained hands-on experience applying classical computer vision techniques in a real-time setting. Through experimentation, we learned how optical flow, robust motion estimation, template matching, and confidence scoring complement each other, and why combining multiple signals leads to more reliable tracking than using a single method alone.
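The signal-fusion idea can be illustrated with a toy confidence score. The weights, threshold, and function names below are assumptions for the sketch, not the project's actual formula.

```python
def tracking_confidence(inlier_ratio: float, template_score: float,
                        w_flow: float = 0.6, w_template: float = 0.4) -> float:
    """Blend two signals already scaled to [0, 1]: the RANSAC inlier ratio
    from optical flow, and the normalized template-matching score.
    Weights are illustrative assumptions."""
    return w_flow * inlier_ratio + w_template * template_score

def should_reinitialize(confidence: float, threshold: float = 0.5) -> bool:
    """Trigger re-detection / template recovery when confidence drops."""
    return confidence < threshold

healthy = tracking_confidence(0.9, 0.8)   # both signals agree -> 0.86
print(healthy, should_reinitialize(healthy))
```

The point of combining signals is that each one fails differently: optical flow degrades under fast motion, template matching under deformation, so a blended score drops whenever either goes wrong.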

What's next for HoloRay Challenge
