Inspiration
Polysound XR is an immersive mixed-reality experience where every gesture transforms into music and visuals. With intuitive hand tracking, you become the conductor, shaping soundscapes in real time as your movements build personalized compositions of musical notes and visual effects, turning the space around you into a vibrant, interactive musical canvas. Whether you're crafting intricate melodies or exploring new rhythms, Polysound XR empowers music lovers of all levels to connect with sound in a whole new way. Conduct your world, shape your sound, with Polysound XR.
What it does
It transforms gestures into music and visuals, allowing users to compose and perform in a mixed-reality environment.
The experience lets you compose a melody in real time with your hands. In the prototype you place notes on a circular canvas across multiple instruments (four in the prototype), and you can play and pause your composed song or reset the canvas at any time.
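The circular canvas can be thought of as a grid of steps (slots around the circle) by instruments (concentric rings). A minimal C# sketch of that mapping is below; the class and field names are our illustration, not the project's actual code, and the geometry (16 steps, fixed ring width) is an assumption.

```csharp
using System;

// Hypothetical sketch: toggling a note where the user pinches on the canvas.
class CircularCanvas
{
    const int Steps = 16;        // slots around the circle (assumed)
    const int Instruments = 4;   // concentric rings, one per instrument
    readonly bool[,] notes = new bool[Instruments, Steps];

    // Convert a point (relative to the canvas centre) into a slot and toggle it.
    public void ToggleAt(float x, float y, float innerRadius, float ringWidth)
    {
        float angle = MathF.Atan2(y, x);          // -PI..PI
        if (angle < 0) angle += 2 * MathF.PI;     // normalize to 0..2PI
        int step = (int)(angle / (2 * MathF.PI) * Steps) % Steps;

        float r = MathF.Sqrt(x * x + y * y);
        int ring = (int)((r - innerRadius) / ringWidth);
        if (r < innerRadius || ring >= Instruments) return;  // outside the rings

        notes[ring, step] = !notes[ring, step];
    }
}
```

Mapping angle to step and radius to instrument keeps a single pinch gesture sufficient for both note placement and removal.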
We have included a narrated onboarding sequence, which explains the functionality to users in a concise and engaging way. You first compose a well-known melody in the tutorial, getting to know all the hand-gesture controls. Afterwards you are free to explore creating your own musical compositions.
See our video demo and APK in the Try It Out section!
How we built it
- We combined hand-tracking technology, real-time audio, and visual effects in Unity, creating a fully functional musical composition experience.
- The project was built entirely in Unity. All functionality was implemented in C# using the OpenXR hand-tracking modules along with the Meta SDK.
- The prototype was constructed to work as a standalone multi-instrument step sequencer.
- The control interface was optimized for flowing motion and ease of use.
- Instrument sounds were hand-crafted with the Analog Lab software instrument in Ableton Live (music-production software) and chosen to complement each other in timbre, stereo width, position in the frequency spectrum, and pitch.
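The standalone multi-instrument step sequencer mentioned above can be sketched as a Unity component that advances one step per beat subdivision and fires every note stored in the current column. This is a minimal illustration under assumed names (one `AudioSource` per instrument, 16th-note steps), not the project's actual implementation.

```csharp
using UnityEngine;

// Hypothetical sketch of a multi-instrument step sequencer in Unity.
public class StepSequencer : MonoBehaviour
{
    public AudioSource[] instruments;   // one source per instrument (4 in the prototype)
    public float bpm = 120f;
    public bool playing;

    bool[,] notes;                      // [instrument, step] grid, filled by gestures
    int step;
    float timer;

    void Awake() => notes = new bool[4, 16];

    void Update()
    {
        if (!playing) return;
        timer += Time.deltaTime;
        float stepLength = 60f / bpm / 4f;   // 16th-note duration
        if (timer < stepLength) return;
        timer -= stepLength;

        // Trigger every instrument with a note in the current step.
        for (int i = 0; i < instruments.Length; i++)
            if (notes[i, step]) instruments[i].Play();

        step = (step + 1) % notes.GetLength(1);
    }

    // Pause keeps the grid; reset clears it and rewinds.
    public void ResetCanvas()
    {
        step = 0;
        timer = 0f;
        System.Array.Clear(notes, 0, notes.Length);
    }
}
```

Keeping the grid independent of playback is what makes play, pause, and reset cheap operations: pausing simply stops the timer, while resetting clears the grid.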
Challenges we ran into
- Balancing performance with visual fidelity and refining gesture recognition to allow for smooth and visually pleasing control
- Designing an intuitive interface that allows users to move fluidly in space, while keeping the controls concise enough not to overwhelm them
- Processing variable sound sequences with multiple pitches and instruments in a fully automated and scalable way.
- Constructing an onboarding procedure that walks the user through available controls in an interactive way, while maintaining a strong story and gamification elements
- Mapping different instruments and note pitches to corresponding visual objects, making the interaction with the interface ergonomically efficient, aesthetically pleasing and intuitive.
Accomplishments that we're proud of
- Creating an intuitive, immersive platform that bridges music and visuals, making composing music accessible to users of all skill levels.
- Constructing a prototype that allows users to fully leverage their creativity and showcases our vision in a fully functional way
- Making an onboarding sequence that draws the user into action instantly and provides a story element to the experience.
What we learned
We gained insights into hand-tracking, gesture-based interactions, spatial UI design, sound sequencing in Unity and optimizing for mixed-reality platforms.
What's next for Polysound XR
Adding multiplayer collaboration, expanding the instrument and custom-sound libraries, and enhancing AR/VR compatibility to offer richer, more immersive music-creation experiences.