Inspiration

We were inspired by a familiar problem: lecturers making constant trips between where they present and their laptop. We wanted a seamless way for a presenter to interact with their slides from anywhere in the room, with some added features as well.

What it does

Our application maps simple, intuitive hand gestures to useful presenting functionality, letting a presenter control their slides without touching the laptop.

How we built it

We used the MediaPipe library for robust, real-time hand tracking, and leveraged LLMs to speed up development and build out more advanced functionality.
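MediaPipe's hand tracker reports 21 normalized landmarks per hand (x and y in [0, 1], with y increasing downward). Our actual gesture set isn't detailed here, but a minimal, illustrative classifier over those landmarks might look like the sketch below; the gesture names and the tip/PIP comparison heuristic are assumptions for illustration, not our exact implementation.

```python
# Landmark indices follow the MediaPipe Hands 21-point model.
FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky tips
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP joints

def extended_fingers(landmarks):
    """Count non-thumb fingers whose tip sits above its PIP joint.

    `landmarks` is a list of 21 (x, y) pairs in normalized image
    coordinates, where y increases downward (as MediaPipe reports).
    A finger is treated as extended when its tip is higher on screen
    than its PIP joint -- a crude but serviceable heuristic.
    """
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def classify(landmarks):
    """Map a landmark frame to a coarse gesture label (illustrative)."""
    n = extended_fingers(landmarks)
    if n >= 4:
        return "open_palm"   # could trigger e.g. next slide
    if n == 0:
        return "fist"        # could trigger e.g. previous slide
    return "none"
```

A real pipeline would feed each webcam frame through `mediapipe.solutions.hands` and debounce the resulting labels over several frames before firing a slide action.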

Challenges we ran into

We faced difficulties combining MediaPipe hand tracking with the OpenCV webcam feed, because MediaPipe provides no absolute depth for the hand's skeletal landmarks. This made the finger ray-casting laser pointer hard to implement, though we believe we arrived at an acceptable prototype by the end of the hackathon.
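With no absolute depth available, one workaround is to ray-cast purely in 2D: extend the direction from the index finger's base knuckle (MediaPipe landmark 5) through its tip (landmark 8) and clamp the result to the frame. The sketch below shows the idea under those assumptions; the `scale` factor is a hypothetical tuning parameter, not a value from our project.

```python
def project_pointer(mcp, tip, width, height, scale=4.0):
    """Project a 2D 'laser dot' along the index finger's direction.

    `mcp` and `tip` are (x, y) pairs in normalized image coordinates
    (MediaPipe landmarks 5 and 8). The MCP->tip vector is extended by
    `scale` beyond the fingertip, clamped to the frame, and converted
    to pixel coordinates for drawing with OpenCV.
    """
    dx, dy = tip[0] - mcp[0], tip[1] - mcp[1]
    px = tip[0] + scale * dx
    py = tip[1] + scale * dy
    # Clamp to the visible frame so the dot never leaves the screen.
    x = min(max(px, 0.0), 1.0)
    y = min(max(py, 0.0), 1.0)
    return int(x * (width - 1)), int(y * (height - 1))
```

Because this ignores depth entirely, the dot tracks where the finger points in the image plane rather than where it points in the room, which is roughly the limitation we ran into.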

Accomplishments that we're proud of

We are proud to have taken an idea from nothing to a working prototype, and we believe this kind of project has real-world potential with some polish and additional features. We are also proud of how well we collaborated as a team.

What we learned

We enjoyed learning about MediaPipe and tkinter. We also learned a lot about Git, particularly resolving merge conflicts and working with branches.

What's next for PTNM

If we were to continue this project, we could make a number of improvements: extending the effective range of the hand tracking, improving the finger ray-casting, and refining the gesture recognition.

Built With

mediapipe, opencv, python, tkinter
