🏆 Awards:
- Best Overall
- YC Unicorn Prize Interview Selection
Event: Hack the North 2025
Date: September 2025
Lattice is a real-time holographic projection framework that captures 3D volumetric data using Xbox Kinect depth cameras and renders fully aligned, live 3D reconstructions for HoloLens telepresence visualization.
The system enables users to experience immersive holographic presence — turning multi-view depth captures into coherent, real-world 3D scenes streamed instantly between physical and mixed-reality environments.
Lattice utilizes three Xbox Kinect v2 cameras, each calibrated with intrinsic parameters for accurate RGB-D mapping. The cameras capture synchronized frames containing both depth (D) and color (RGB) data streams.
- Each frame is converted into a point cloud via per-pixel back-projection using the camera's pinhole model:

  P = K^-1 * [u, v, 1]^T * D(u, v)

  where K is the intrinsic matrix and D(u, v) is the depth value at pixel (u, v).
- The corresponding color value is assigned to each point for photorealistic rendering.
Captured point clouds are serialized and streamed asynchronously to a central server using WebSocket connections.
- Each packet includes: timestamp, camera ID, RGB-D data, and transformation metadata.
- A custom binary protocol minimizes overhead and preserves spatial fidelity.
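A packet along these lines can be sketched as a fixed header followed by the point payload. The field names and layout below are assumptions for illustration; the actual wire format of Lattice's custom protocol is not specified here:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical packet header; field layout is illustrative only.
struct PacketHeader {
    uint64_t timestamp_us;  // capture time, microseconds
    uint8_t  camera_id;     // which Kinect produced this frame
    uint32_t point_count;   // number of XYZRGB points that follow
};

// Serialize into a tightly packed byte buffer (host endianness assumed),
// avoiding struct padding so the wire size stays minimal.
std::vector<uint8_t> serialize(const PacketHeader& h) {
    std::vector<uint8_t> buf(sizeof h.timestamp_us + sizeof h.camera_id + sizeof h.point_count);
    size_t off = 0;
    std::memcpy(buf.data() + off, &h.timestamp_us, sizeof h.timestamp_us); off += sizeof h.timestamp_us;
    std::memcpy(buf.data() + off, &h.camera_id,    sizeof h.camera_id);    off += sizeof h.camera_id;
    std::memcpy(buf.data() + off, &h.point_count,  sizeof h.point_count);
    return buf;
}

PacketHeader deserialize(const std::vector<uint8_t>& buf) {
    PacketHeader h{};
    size_t off = 0;
    std::memcpy(&h.timestamp_us, buf.data() + off, sizeof h.timestamp_us); off += sizeof h.timestamp_us;
    std::memcpy(&h.camera_id,    buf.data() + off, sizeof h.camera_id);    off += sizeof h.camera_id;
    std::memcpy(&h.point_count,  buf.data() + off, sizeof h.point_count);
    return h;
}
```

Packing fields manually rather than sending the raw struct sidesteps compiler padding, which is one way a binary protocol keeps per-packet overhead low.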
The server performs multi-sensor spatial calibration using:
- Iterative Closest Point (ICP) to compute rigid body transformations aligning each camera’s frame to a common global coordinate system.
- Convex hull enclosure to maintain consistent spatial boundaries and prevent ghosting between overlapping clouds.
Real-time transforms are applied to every incoming point cloud frame to ensure geometric alignment across views.
To maintain smooth motion capture and reduce jitter:
- Frames are timestamp-synchronized within ±5 ms tolerance.
- Overlapping regions are merged through spatial averaging and bilateral filtering, removing sensor noise while preserving edges.
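The ±5 ms tolerance above amounts to a nearest-neighbor search over timestamps: for each reference-camera frame, pick the closest frame from another camera and accept the pair only if the gap is within tolerance. A sketch under that assumption (a linear scan for clarity; a real pipeline over sorted timestamps would use a binary search or a sliding cursor):

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

constexpr int64_t kToleranceUs = 5000;  // +/- 5 ms, in microseconds

// Returns the index of the frame in `other` closest to `ref_ts`,
// or -1 if no frame falls within the tolerance window.
int matchFrame(int64_t ref_ts, const std::vector<int64_t>& other) {
    int best = -1;
    int64_t best_gap = kToleranceUs + 1;
    for (size_t i = 0; i < other.size(); ++i) {
        int64_t gap = std::llabs(other[i] - ref_ts);
        if (gap < best_gap) { best_gap = gap; best = static_cast<int>(i); }
    }
    return (best_gap <= kToleranceUs) ? best : -1;
}
```

Frames that find no partner within the window are dropped rather than fused, which trades a little completeness for jitter-free motion.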
The final unified 3D scene is transmitted to the HoloLens client, where the user can view the live reconstructed hologram in full 3D space.
- Rendering uses Unity3D with GPU-accelerated shaders for point-based rendering.
- The framework supports both local rendering and remote streaming modes for telepresence applications.
| Component | Technology |
|---|---|
| Capture | Xbox Kinect SDK, OpenCV, C++ |
| Networking | WebSocket (Boost Asio), Protobuf serialization |
| Server Fusion | C++, PCL (Point Cloud Library), Eigen |
| Visualization | Unity3D, HoloLens SDK, C# |
| Infrastructure | UDP transport layer, asynchronous threading, timestamp synchronization |
- Multi-Kinect RGB-D fusion with real-time calibration
- Low-latency network streaming with timestamp alignment
- Robust 3D point cloud merging using ICP
- Live holographic rendering on Microsoft HoloLens