Inspiration

  • Several of our team members have taught children with severe learning difficulties.
  • During online lessons, these children often struggled to concentrate.
  • We wanted to create a tool that makes learning more interactive and fun, helping improve focus and engagement.

What it does

  • Uses computer vision to detect hand positions and gestures in real time.
  • Lets the user climb around the screen as an alien character.
  • Interprets specific gestures to trigger different actions and movements on the screen.
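As a rough sketch of how recognised gestures could drive character actions: a small lookup table keyed by gesture label. The gesture names below follow MediaPipe's canned gesture recogniser labels, but this particular mapping and the action names are our own illustrative assumptions, not the project's actual design:

```python
# Hypothetical mapping from recognised gesture labels to character actions.
# Labels like "Closed_Fist" and "Open_Palm" come from MediaPipe's canned
# gesture recogniser; the action names are illustrative only.
GESTURE_ACTIONS = {
    "Closed_Fist": "grab_hold",    # grip a climbing hold
    "Open_Palm": "release_hold",   # let go of the current hold
    "Pointing_Up": "reach_up",     # reach toward the next hold
}

def action_for(gesture: str) -> str:
    """Return the action for a recognised gesture, or 'idle' if unmapped."""
    return GESTURE_ACTIONS.get(gesture, "idle")
```

Dispatching through a table like this keeps the recogniser output decoupled from game logic, so new gestures can be added without touching the movement code.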

How we built it

  • For hand gesture detection, we used the MediaPipe gesture recognition machine learning model.
  • We used overlay_lib to overlay the hand gestures and alien character.
  • We used vector maps to determine and scale the character’s position on the screen.
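MediaPipe reports hand landmarks as coordinates normalised to the range [0, 1] in each axis, so placing the character means scaling those values to the display resolution. A minimal sketch of that step (the function name and clamping behaviour are our own assumptions, not the project's actual code):

```python
def landmark_to_screen(x_norm: float, y_norm: float,
                       screen_w: int, screen_h: int) -> tuple[int, int]:
    """Scale a normalised landmark (0..1 per axis) to pixel coordinates."""
    # Clamp first so slightly out-of-range detections stay on screen.
    x = min(max(x_norm, 0.0), 1.0)
    y = min(max(y_norm, 0.0), 1.0)
    return round(x * screen_w), round(y * screen_h)
```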

Challenges we ran into

  • Optimisation issues when displaying overlays on the screen.
  • Difficulties achieving transparent and smooth overlay rendering.
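Transparent overlay rendering ultimately comes down to per-pixel alpha compositing. A minimal sketch of the standard "over" operation for a single RGB pixel (this is the textbook formula, not the project's actual renderer):

```python
def blend_over(fg: tuple[int, int, int], bg: tuple[int, int, int],
               alpha: float) -> tuple[int, int, int]:
    """Composite a foreground pixel over a background pixel.

    alpha = 1.0 means a fully opaque foreground, 0.0 fully transparent.
    """
    # Standard alpha blend: out = alpha * fg + (1 - alpha) * bg, per channel.
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))
```

Doing this per pixel in Python is slow, which hints at why overlay performance needed optimising; real renderers vectorise or offload this blend.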

Accomplishments that we're proud of

  • Successfully working with a machine learning model to detect different hand gestures and translate them into interactive functionalities.
  • Optimising processing time for smoother, real-time performance.

What we learned

  • How to work with MediaPipe gesture recognition.
  • How to use overlay_lib for creating interactive overlays.

What's next for Mime Climb

  • Expanding gesture recognition functionality.
  • Adding more interactive and educational gameplay features.

Built With

  • MediaPipe
  • overlay_lib
