
What it does

Sci-fi is no longer just fiction. Introducing Jarvis Lite, a device that turns any wall into your very own interactive touchscreen display! Using a projector and two Pi cameras, it relies on stereoscopic vision to locate both your hands and the projected screen. With simple hand gestures (like pointing, pinching, and a thumbs up), you can interact with the screen directly. Not only can Jarvis Lite turn any wall into a touchscreen, it can turn any wall into an entire game world: by tracking a black ball thrown at the screen, it translates the ball's trajectory into in-game actions. Jarvis Lite can be used for drawing, teaching, gaming, and browsing.

How we built it

Jarvis Lite is built from two Raspberry Pi Camera Module 3s connected to a Raspberry Pi 5 acting as a WebRTC server, all assembled on a modular 3D-printed mount. With two cameras, we can triangulate the position of anything in view. We use OpenCV and MediaPipe to track hand gestures, and ArUco markers to calibrate the distance from the screen. Combining these two technologies lets us interact with the projected screen using our hands: clicking, holding down, dragging, and even scrolling. A grayscale mask isolates and tracks the thrown ball.
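To give a feel for the gesture side, here is a rough, single-camera sketch of detecting a pinch from MediaPipe hand landmarks. The threshold value and the use of one webcam (rather than our two-camera triangulation) are illustrative simplifications, not our exact pipeline:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# Illustrative pinch threshold in normalized image coordinates (tuned per setup).
PINCH_THRESHOLD = 0.05

def detect_pinch(frame_bgr, hands):
    """Return (x, y) of the index fingertip if a pinch gesture is seen, else None."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None
    lm = result.multi_hand_landmarks[0].landmark
    thumb_tip = lm[mp_hands.HandLandmark.THUMB_TIP]
    index_tip = lm[mp_hands.HandLandmark.INDEX_FINGER_TIP]
    # Distance between thumb and index fingertips as a crude pinch signal.
    dist = ((thumb_tip.x - index_tip.x) ** 2 + (thumb_tip.y - index_tip.y) ** 2) ** 0.5
    if dist < PINCH_THRESHOLD:
        return index_tip.x, index_tip.y  # normalized [0, 1] coordinates
    return None

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            pinch = detect_pinch(frame, hands)
            if pinch:
                print("pinch at", pinch)
    cap.release()
```

In the real device, the pinch position would then be mapped through the screen calibration and forwarded as a click or drag event.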

Challenges we ran into

  • communication between the Pi cameras and the computer
  • consistent hand tracking (we had to create custom gestures)
  • adjusting the sensitivity of inputs
  • implementing effective ball tracking (see the sketch after this list)
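The grayscale-mask idea for the ball boils down to thresholding for dark pixels and keeping the largest blob. A simplified sketch, where the threshold and minimum-area values are illustrative and would be tuned per setup:

```python
import cv2
import numpy as np

# Illustrative values for a dark ball against a brighter projected background.
DARK_THRESHOLD = 60
MIN_BALL_AREA = 200  # pixels; rejects small specks of noise

def find_ball(frame_bgr):
    """Return the (x, y) pixel centre of the largest dark blob, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Pixels darker than the threshold become the mask (the black ball).
    _, mask = cv2.threshold(gray, DARK_THRESHOLD, 255, cv2.THRESH_BINARY_INV)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    ball = max(contours, key=cv2.contourArea)
    if cv2.contourArea(ball) < MIN_BALL_AREA:
        return None
    (x, y), _radius = cv2.minEnclosingCircle(ball)
    return int(x), int(y)
```

Tracking the centre frame-to-frame gives the trajectory that the game layer consumes.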

Accomplishments that we're proud of

Being able to actually draw and navigate the computer with Jarvis Lite. Building a functional game and translating thrown balls into in-game events. Autocalibrating against the screen with ArUco markers.
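The autocalibration step can be sketched roughly as: detect ArUco markers projected near the screen corners, then fit a homography from camera pixels to screen coordinates. The marker IDs, dictionary choice, one-marker-per-corner layout, and the recent (4.7+) OpenCV ArUco API are assumptions for illustration:

```python
import cv2
import numpy as np

# Assumed layout: markers with IDs 0-3 at the screen corners, in the order
# top-left, top-right, bottom-right, bottom-left. IDs and dictionary are illustrative.
ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
DETECTOR = cv2.aruco.ArucoDetector(ARUCO_DICT, cv2.aruco.DetectorParameters())
SCREEN_W, SCREEN_H = 1920, 1080  # projector resolution

def calibrate_homography(frame_bgr):
    """Fit a camera-pixel -> screen-coordinate homography from the corner markers."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = DETECTOR.detectMarkers(gray)
    if ids is None:
        return None
    centres = {int(i): c[0].mean(axis=0) for i, c in zip(ids.flatten(), corners)}
    if not all(k in centres for k in range(4)):
        return None  # wait until all four corner markers are visible
    src = np.float32([centres[0], centres[1], centres[2], centres[3]])
    dst = np.float32([[0, 0], [SCREEN_W, 0], [SCREEN_W, SCREEN_H], [0, SCREEN_H]])
    H, _ = cv2.findHomography(src, dst)
    return H

def to_screen(H, point_xy):
    """Map a camera-space point (e.g. a fingertip) into projected-screen pixels."""
    p = cv2.perspectiveTransform(np.float32([[point_xy]]), H)
    return float(p[0][0][0]), float(p[0][0][1])
```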

What we learned

  • MediaPipe hand tracking
  • MediaMTX for streaming the Raspberry Pi camera feeds at 60 fps over Wi-Fi (a minimal client sketch follows)
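MediaMTX exposes each Pi camera as a network stream that any machine on the network can read like a normal OpenCV capture. A minimal client sketch; the host name and stream path are placeholders for whatever is configured in mediamtx.yml, not necessarily our exact setup:

```python
import cv2

# Placeholder URL: MediaMTX serves RTSP on port 8554 by default.
STREAM_URL = "rtsp://raspberrypi.local:8554/cam0"

cap = cv2.VideoCapture(STREAM_URL)
if not cap.isOpened():
    raise RuntimeError("could not open stream " + STREAM_URL)

while True:
    ok, frame = cap.read()
    if not ok:
        break  # stream dropped; a real client would reconnect here
    cv2.imshow("pi-cam", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```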

What's next for Jarvis Lite

Better tracking, lower latency, and a simpler, more integrated design.

Built With

  • Raspberry Pi 5 and two Raspberry Pi Camera Module 3s
  • OpenCV
  • MediaPipe
  • MediaMTX
  • ArUco markers
