Inspiration

Our team was inspired by our journeys in learning musical instruments; regardless of instrument, the importance of practice cannot be overstated: even a day without practicing can greatly set back one's progress. When on the go or traveling, however, it's not feasible to bring along instruments, especially large ones (like the piano!). While existing programs like GarageBand let you compose melodies, they falter as a method of practicing, since you can't exercise technique on them. We aimed to address this issue by creating a tool that leverages ordinary objects people already carry (a laptop and a piece of paper) to balance practicality and functionality.

What it does

The program takes the user's laptop webcam as its input. The user aligns a piano template page (a piece of paper with piano keys printed on it) with guide lines on the camera-view interface. The program then tracks the user's fingertips, detects when they touch the keys, whose locations are predefined by the page alignment, and plays the appropriate note. Sheet music is also displayed on the side for coaching purposes.
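Because the page is manually aligned with the guide lines, mapping a fingertip to a key reduces to simple geometry over a known rectangle. The sketch below illustrates the idea; the region coordinates, the 14-key layout, and all names are illustrative assumptions, not the actual implementation.

```javascript
// Assumed: after alignment, the printed keys occupy a fixed rectangle in
// normalized [0, 1] camera-frame coordinates, divided into equal-width keys.
const KEY_REGION = { x: 0.1, y: 0.55, width: 0.8, height: 0.35 };
const NUM_KEYS = 14; // e.g. two printed octaves of white keys (assumption)

// Returns the index of the key under a fingertip, or null if the tip
// is outside the key region entirely.
function keyAt(tipX, tipY) {
  const inX = tipX >= KEY_REGION.x && tipX <= KEY_REGION.x + KEY_REGION.width;
  const inY = tipY >= KEY_REGION.y && tipY <= KEY_REGION.y + KEY_REGION.height;
  if (!inX || !inY) return null;
  const keyWidth = KEY_REGION.width / NUM_KEYS;
  return Math.floor((tipX - KEY_REGION.x) / keyWidth);
}
```

In this scheme the returned index would be looked up in a table of notes to decide which sound to play.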

How we built it

We built this project using MediaPipe, JavaScript, and React, splitting development between two teams: backend and frontend. The backend team used JavaScript and MediaPipe to access the user's webcam and detect the positions of the fingertips. Each fingertip had a node drawn on it via a React canvas element for visualization; similarly, nodes were placed on the individual keys to detect when they were pressed. The frontend team handled the user interface, camera view, and sheet music display using React and JavaScript.
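MediaPipe Hands reports 21 normalized landmarks per detected hand, with the five fingertip landmarks at indices 4 (thumb), 8, 12, 16, and 20. A minimal sketch of extracting those tips and drawing nodes on a canvas overlay might look like the following; the function names and the 2D canvas context `ctx` are assumptions for illustration.

```javascript
// MediaPipe Hands fingertip landmark indices: thumb, index, middle, ring, pinky.
const FINGERTIP_INDICES = [4, 8, 12, 16, 20];

// landmarks: array of 21 {x, y, z} objects, with x/y normalized to [0, 1].
function fingertips(landmarks) {
  return FINGERTIP_INDICES.map((i) => landmarks[i]);
}

// Visualization pass: draw a small node on each fingertip, scaling the
// normalized coordinates up to the canvas size. `ctx` is assumed to be a
// CanvasRenderingContext2D from the overlay canvas element.
function drawFingertips(ctx, landmarks, width, height) {
  for (const tip of fingertips(landmarks)) {
    ctx.beginPath();
    ctx.arc(tip.x * width, tip.y * height, 6, 0, 2 * Math.PI);
    ctx.fill();
  }
}
```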

Challenges we ran into

We ran into challenges with the template-page alignment system. Initially, we opted to use OpenCV to automatically detect the edges of the sheet. Though this approach would have made for a smoother setup process, we found that its success was heavily dependent on conditions such as lighting and shadows, so given the time constraints amid other debugging, we fell back on a manual alignment process. Additionally, detecting collisions between fingertips and the piano keys presented issues: certain pairs of adjacent keys would both register presses when only one was pressed. We addressed this by adding a collision-threshold factor to reduce the overlap.
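One way to picture the threshold fix: shrink each key's hitbox toward its center by a fraction of the key width before testing the fingertip against it, so a tip hovering over the border between two keys lands in neither box instead of both. The sketch below is an illustration of that idea; the names and the 0.25 factor are assumptions, not the project's actual values.

```javascript
// Fraction of the key width trimmed from each side of the hitbox (assumed value).
const THRESHOLD = 0.25;

// key: {x, y, width, height}; tip: {x, y} — both in the same coordinate space.
// A press registers only if the tip lands inside the shrunken hitbox.
function hitsKey(tip, key) {
  const inset = key.width * THRESHOLD;
  return (
    tip.x >= key.x + inset &&
    tip.x <= key.x + key.width - inset &&
    tip.y >= key.y &&
    tip.y <= key.y + key.height
  );
}
```

The trade-off is that presses very close to a key boundary register on neither key, which in practice is less disruptive than two notes sounding at once.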

Accomplishments that we're proud of

We're proud of the way we time-boxed our troubleshooting in order to deliver an MVP. We faced a lot of situations where we realized we couldn't implement the exact features we wanted to; instead of spending extra time troubleshooting, we were able to deliver on the core functionality with room for expansion later.

What we learned

This project let us explore React in the context of image processing, an area in which our team members had little prior experience, so we found it to be an interesting exercise.

What's next for PaperKeys

We'd like to lean into the coaching aspect of the application by implementing a dynamic sheet-music system that gives the user feedback on exactly which notes they play incorrectly; this could even let us incorporate a chatbot for personalized, human-like feedback.

Built With

MediaPipe, JavaScript, React
