Inspiration

The inspiration behind HoloStream stemmed from how traditional replays often fail to capture the depth and spatial awareness of real-time sports plays. In fast, technical games like football, basketball, or Formula 1, it can be difficult for viewers to understand positioning, movement, and timing from the usual (and often boring) flat-screen angles.

While brainstorming, we remembered the holographic creatures from Spy Kids 2 and wondered: how cool would it be to have a similar interface for enjoying real-life sports at home, like they do in sci-fi?

That led us to the idea of a holographic sports visualizer that lets fans view plays as if they were in the stands: watch your favorite moments frozen in time, reconstructed from real game data, and explore visualizations of potential plays, making it an exciting learning experience for those unfamiliar with the rules.

But beyond entertainment, we believe HoloStream is important because sports are a means of expression for so many individuals. We want to make sure that the hardworking athletes who dedicate their lives to these games get the recognition and hype they deserve.

What We Learned

We learned A LOT during this project, including:

  • How to build a cross-platform React Native application using a variety of packages that require different configurations on different devices.
  • Pose estimation for a video using a variety of machine learning models.
  • Turning 2D videos into 3D simulations using EasyMocap and computer vision.
  • Monocular motion capture from complex videos.
  • Turning the extracted pose data into usable, clean animations and fitting them onto a custom-generated human figure.
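One step in the pipeline above, cleaning raw pose data before fitting it onto a figure, can be sketched simply. Per-frame joint positions from a pose estimator are typically jittery, and a centered moving average is a common first pass at smoothing them. This is a minimal illustrative sketch, not HoloStream's actual code; the function name and window size are assumptions.

```python
def smooth_keypoints(frames, window=5):
    """Smooth noisy pose data with a centered moving average.

    frames: list of frames, each a list of (x, y, z) joint tuples.
    window: number of frames to average over (centered on each frame).
    Returns a new list of frames with the same shape.
    """
    half = window // 2
    smoothed = []
    for i in range(len(frames)):
        # Clamp the averaging window at the start and end of the clip.
        lo, hi = max(0, i - half), min(len(frames), i + half + 1)
        frame = []
        for j in range(len(frames[i])):
            # Gather this joint's positions across the window and
            # average each coordinate independently.
            pts = [frames[k][j] for k in range(lo, hi)]
            frame.append(tuple(sum(c) / len(pts) for c in zip(*pts)))
        smoothed.append(frame)
    return smoothed
```

In practice, more sophisticated filters (e.g. a One Euro or Savitzky–Golay filter) trade less lag for similar jitter reduction, but the idea is the same: denoise before retargeting, not after.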

We also deepened our understanding of the entire full-stack pipeline for AR/XR content.

How We Built It

We used Blender to model players, the field, and dynamic play animations. Each play was designed as a 3D animation sequence that mirrors real-time player movement, ball trajectories, and other interactions.

The client-side application was built with React Native, which allowed us to target both iOS and Android from a single codebase. We integrated Three.js to render plays as interactive holograms.

Challenges We Faced

  • Managing polygon counts on 3D models
  • Maintaining frame rates for non-critical animations
  • Resolving complex dependency conflicts among EasyMocap, SMPL, FFmpeg, and CUDA to run the model
  • Converting detected 3D SMPL data into Biovision Hierarchy (BVH) format and applying it to a human model
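To make the last challenge concrete, BVH is a plain-text format with a skeleton block followed by per-frame channel values. Below is a minimal, hypothetical sketch that emits a valid BVH file for a two-joint chain; a real SMPL-to-BVH converter would map all 24 SMPL joints and convert axis-angle rotations to Euler angles, but this shows the file structure involved. All names and parameters here are illustrative.

```python
def write_bvh(root_name, child_name, child_offset, frames, frame_time=1 / 30):
    """Emit a minimal BVH file as a string.

    frames: list of 9-tuples per frame: the root's 6 channels
    (Xpos Ypos Zpos Zrot Xrot Yrot) followed by the child joint's
    3 rotation channels (Zrot Xrot Yrot).
    """
    lines = [
        "HIERARCHY",
        f"ROOT {root_name}",
        "{",
        "  OFFSET 0.0 0.0 0.0",
        "  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation",
        f"  JOINT {child_name}",
        "  {",
        f"    OFFSET {child_offset[0]} {child_offset[1]} {child_offset[2]}",
        "    CHANNELS 3 Zrotation Xrotation Yrotation",
        "    End Site",
        "    {",
        "      OFFSET 0.0 1.0 0.0",
        "    }",
        "  }",
        "}",
        "MOTION",
        f"Frames: {len(frames)}",
        f"Frame Time: {frame_time:.6f}",
    ]
    for frame in frames:
        # One line per frame: all channel values in hierarchy order.
        lines.append(" ".join(f"{v:.4f}" for v in frame))
    return "\n".join(lines)
```

The hard part in our pipeline was not the file format itself but getting rotations right: SMPL poses are axis-angle per joint, while BVH expects Euler angles in the channel order declared in the hierarchy, so each joint needs a consistent rotation-order conversion.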