Inspiration
Simulation is the future of learning, and vision is the future of user interfaces. That's why we wanted to create a fun simulator that worked with hand gestures.
What it does
It generates live music that you conduct with your hands, like a symphony. You control volume, tempo, rhythm, and the leading instruments with hand gestures. It lets you practice conducting and simulate your musical skills.
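To make that concrete, here is a minimal sketch of how recognized gestures might map onto those musical parameters. The gesture names and value ranges are hypothetical, not our exact scheme:

```python
# Hypothetical mapping from recognized gestures to musical parameters.
# Gesture names, deltas, and instrument labels are illustrative only.

GESTURE_CONTROLS = {
    "palm_raise":  {"param": "volume", "delta": +0.1},
    "palm_lower":  {"param": "volume", "delta": -0.1},
    "fast_wave":   {"param": "tempo",  "delta": +5},   # BPM
    "slow_wave":   {"param": "tempo",  "delta": -5},
    "point_left":  {"param": "lead",   "value": "strings"},
    "point_right": {"param": "lead",   "value": "brass"},
}

def apply_gesture(state: dict, gesture: str) -> dict:
    """Update the conducting state from one recognized gesture."""
    control = GESTURE_CONTROLS.get(gesture)
    if control is None:
        return state  # unknown gesture: leave the music unchanged
    if "delta" in control:
        state[control["param"]] = state.get(control["param"], 0) + control["delta"]
    else:
        state[control["param"]] = control["value"]
    return state
```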
How we built it
We started by testing Magenta and understanding the model. Then we created the UX to recognize hand movements and give feedback to the user. Once we had that in place, we established a socket between the backend and the frontend, sending gestures that create new prompts to transform the music.
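A minimal sketch of the backend side of that socket, assuming a WebSocket server built with the Python `websockets` library and a hypothetical gesture-event and prompt format (our actual stack details may differ):

```python
# Sketch: receive gesture events over a WebSocket and turn them into
# text prompts that steer the music model. Assumes the `websockets`
# library (pip install websockets); prompt wording is hypothetical.
import asyncio
import json

import websockets

def gesture_to_prompt(event: dict) -> str:
    """Translate a gesture event into a prompt for the music model."""
    # e.g. {"gesture": "palm_raise"} -> "increase the volume"
    prompts = {
        "palm_raise": "increase the volume",
        "fast_wave": "play faster",
        "point_left": "let the strings lead",
    }
    return prompts.get(event.get("gesture"), "keep playing")

async def handle_client(websocket):
    async for message in websocket:
        event = json.loads(message)
        prompt = gesture_to_prompt(event)
        # Here the prompt would be fed to the generation model and the
        # updated audio streamed back; we just echo it for illustration.
        await websocket.send(json.dumps({"prompt": prompt}))

async def main():
    async with websockets.serve(handle_client, "localhost", 8765):
        await asyncio.Future()  # run forever

if __name__ == "__main__":
    asyncio.run(main())
```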
Challenges we ran into
- Our first implementation failed and we had to pivot.
- Controlling the latency between the frontend, the backend, and the model to maintain the continuity of the sound.
- Designing hand gestures that make intuitive sense, and building the grid that maps gestures to complex orchestral controls (see the sketch after this list).
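A rough sketch of the grid idea: quantize the normalized hand position into cells, where each cell selects an orchestral control. The layout and labels below are illustrative, not our exact grid:

```python
# Illustrative gesture grid: quantize a normalized hand position
# (x, y in [0, 1]) into a 3x3 grid, where each cell selects an
# orchestral control. Layout and labels are hypothetical.

GRID = [
    ["strings", "woodwinds", "brass"],   # top row: lead instrument
    ["softer",  "steady",    "louder"],  # middle row: dynamics
    ["slower",  "hold",      "faster"],  # bottom row: tempo
]

def cell_for_position(x: float, y: float, size: int = 3) -> str:
    """Map a normalized hand position to its grid cell's control."""
    col = min(int(x * size), size - 1)
    row = min(int(y * size), size - 1)
    return GRID[row][col]

print(cell_for_position(0.9, 0.1))  # -> "brass"
```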
Accomplishments that we're proud of
We were able to build working hand-gesture recognition in a couple of hours and connect the model so it responds to gestures with very low latency.
What we learned
We can now move on to more complex simulators driven by hand gestures. This is also a quick way to build skills we could apply in the future, for example managing elements in robotics to control robots remotely.
What's next for Maestro
We need to keep improving the gesture recognition and give the conductor more control over the kind of music. We think that if we can connect a model that generates the score in real time, while the conductor decides how to shape it, that would be amazing.