Inspiration
We are high school friends who met through our school's jazz program, and music has always been an integral part of our lives. We wanted to combine our passion for composition and production with our recent interest in motion capture. This is how the idea for MusicMotion was born.
What it does
Our Python-based app provides a full interface that takes a user from the beginning to the end of the song-creation process. Users have basic control over tempo and key type (major or minor), and can then begin an 8-bar repeated layering process with access to 4 unique instruments.
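To make the layering idea concrete, here is a minimal sketch of how an 8-bar loop with stacked instrument layers might be modeled. This is a hypothetical illustration, not MusicMotion's actual code; the instrument names and function names are assumptions.

```python
# Hypothetical sketch of an 8-bar layering loop (not MusicMotion's real code).
# Each layer holds one note (or rest) per bar; layers play back simultaneously.

BARS_PER_LOOP = 8
INSTRUMENTS = ["piano", "bass", "drums", "synth"]  # assumed instrument names

def new_layer(instrument):
    """Create an empty 8-bar layer for the given instrument."""
    if instrument not in INSTRUMENTS:
        raise ValueError(f"unknown instrument: {instrument}")
    return {"instrument": instrument, "bars": [None] * BARS_PER_LOOP}

def record_bar(layer, bar_index, note):
    """Write a note into one bar; indices wrap so the loop repeats."""
    layer["bars"][bar_index % BARS_PER_LOOP] = note
    return layer

# Build a song by stacking layers on top of each other.
song = [new_layer("piano")]
record_bar(song[0], 0, "C4")
record_bar(song[0], 4, "G4")
```

Because bar indices wrap modulo 8, a performer can keep recording past the end of the loop and the new notes land back inside the same repeated 8 bars.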
How we built it
Utilizing MediaPipe's hand-tracking features, we mapped hand positioning to the various musical controls within our app.
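The mapping step can be sketched as follows. MediaPipe's hand tracker reports landmark coordinates normalized to [0, 1] across the camera frame; the sketch below stands in a plain `(x, y)` pair for one landmark and maps it to tempo and key type. The parameter ranges and function names here are assumptions for illustration, not the app's actual mapping.

```python
# Hypothetical sketch: map a normalized hand landmark to musical parameters.
# MediaPipe reports hand landmarks as (x, y) in [0, 1], with y = 0 at the
# top of the frame; we assume one such pair here rather than calling MediaPipe.

TEMPO_MIN, TEMPO_MAX = 60, 180  # assumed tempo range in beats per minute

def y_to_tempo(y):
    """Map vertical hand position to tempo: raising the hand (smaller y)
    gives a faster tempo."""
    y = min(max(y, 0.0), 1.0)  # clamp to the normalized range
    return round(TEMPO_MAX - y * (TEMPO_MAX - TEMPO_MIN))

def x_to_key(x):
    """Map horizontal position to key type: right half major, left half minor."""
    return "major" if x >= 0.5 else "minor"
```

Reading controls off continuous landmark positions like this is what lets a small set of gestures cover several parameters at once.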
Challenges we ran into
Balancing the functionality and complexity of the app was a challenge, because music composition and production are infinitely deep. When you play a real instrument, you have full control over every sound you generate, but motion capture is limited to a finite set of gestures. Because there are only so many distinguishable hand inputs, we had to choose which parameters users could control and prioritize the musical features that mattered most.
What's next for MusicMotion
Scaling up the complexity of the app by adding control over new parameters, while preventing UI clutter and input complications, is both essential and immediate. We hope to continue developing this software: we are truly passionate about the topic and have already had lots of laughs making new tunes.