Inspiration

When we were forming a team for the hackathon, we wanted teammates who were all guitar players. This shared part of our identity helped us bond and build a fun project we were proud of, so we decided to create a game that combined music and guitar playing.

What it does

<Git />ar is a music game that turns real guitar playing into gameplay. It listens to the notes you play through audio analysis and uses computer vision to recognize chord shapes from your finger placements on the fretboard.
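A core step in this kind of audio analysis is mapping a detected fundamental frequency (from a pitch tracker such as aubio or an FFT-based estimator) to a note name. As a minimal sketch of that step (our own illustration, not necessarily the project's exact code):

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_note(freq_hz: float) -> str:
    """Map a detected fundamental frequency to a pitch name like 'E2'."""
    # Semitones above C0; A4 = 440 Hz sits 57 semitones above C0.
    n = round(12 * math.log2(freq_hz / 440.0)) + 57
    return NOTE_NAMES[n % 12] + str(n // 12)

print(freq_to_note(82.41))   # low E string -> 'E2'
print(freq_to_note(440.0))   # concert A   -> 'A4'
```

The game can then compare the detected note against the next note in the generated progression.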

The game is built around the theme of identity. Players choose a music genre that reflects who they are, and we use the Gemini API to generate an original song and matching note progression in that style. Every session is personal, because the music is made for you.

<Git />ar blends practice, creativity, and self-expression into an interactive experience where your identity shapes the music you play.

How we built it

The Gemini API generates note progressions and chords in your chosen genre and style, building a song just for you. Audio analysis detects the notes you play, and a machine-learning computer-vision model recognizes your chord shapes. The Pygame engine ties it all together into an interactive guitar-practice game.
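The generation step can be sketched as a prompt asking Gemini for a machine-readable note list, plus a parser that keeps only valid pitch names. The prompt wording, function names, and model name here are our own assumptions; the live call runs only if an API key is configured:

```python
import os
import re

NOTE_RE = re.compile(r"^[A-G][#b]?[0-8]$")  # e.g. E2, G#3, Bb4

def build_prompt(genre: str, length: int = 16) -> str:
    """Ask for a note progression in the player's chosen genre."""
    return (
        f"Generate a guitar melody of {length} notes in the style of {genre}. "
        "Reply with only scientific pitch names separated by commas, "
        "e.g. E2, A2, C#3."
    )

def parse_notes(reply: str) -> list[str]:
    """Keep only tokens that look like valid pitch names."""
    tokens = [t.strip() for t in reply.split(",")]
    return [t for t in tokens if NOTE_RE.match(t)]

# Live call only when a key is present (hypothetical usage):
if os.environ.get("GEMINI_API_KEY"):
    import google.generativeai as genai  # pip install google-generativeai
    genai.configure(api_key=os.environ["GEMINI_API_KEY"])
    model = genai.GenerativeModel("gemini-1.5-flash")
    reply = model.generate_content(build_prompt("blues")).text
    print(parse_notes(reply))
```

Filtering the reply through a strict pattern guards the game loop against any extra prose the model might add around the notes.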

Challenges we ran into

Audio analysis libraries let us detect individual notes, but detecting chords proved much harder. We worked around this by training our own machine learning model, which classifies 8-dimensional feature vectors extracted with a computer vision library. We then worried we had overfit the model to our own team's playing, but we were able to fix this by validating it with other hackers at the event. Although chord detection works on its own, we were not able to integrate it into the game: we spent the remaining time on music generation with the Gemini API, got it generating notes, and ran out of time to fully implement chord generation.
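The chord classifier described above can be sketched with scikit-learn: an 8-dimensional feature vector per frame, fed to a small supervised model. The features here are synthetic stand-ins (the team's real features come from a computer vision library, e.g. hand landmarks from something like MediaPipe), so this only illustrates the shape of the pipeline:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def fake_features(center: float, n: int) -> np.ndarray:
    """Stand-in for the 8-D vectors extracted from hand landmarks."""
    return rng.normal(center, 0.05, size=(n, 8))

# Toy training set: two chord shapes with well-separated feature clusters.
X = np.vstack([fake_features(0.2, 40), fake_features(0.8, 40)])
y = ["G major"] * 40 + ["C major"] * 40

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(clf.predict(fake_features(0.8, 1)))
```

Validating with hackers outside the team, as we did, is exactly the check that catches a model overfit to the training players' hand shapes.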

Accomplishments that we're proud of

We used our own designs and sprites in the game. We didn't use generative AI for any of the assets or images; they were all hand-drawn by our team.

We also implemented our own machine learning model to recognize hand shapes and guitar chords.

What we learned

We learned how to train a machine learning model and the challenges that came with creating and validating our own training data. We learned to draw our own sprites and images for the design of our game.

What's next for <Git />ar

Next, we want to improve the accuracy of our audio and vision models across different playing styles and skill levels, and expand the range of supported instruments. We plan to fully implement chord detection, add difficulty modes, scoring, and progression systems to make the game more engaging over time. We also want to explore multiplayer and community features, allowing players to share songs generated from their own identities. Ultimately, we hope to turn <Git />ar into a platform that makes music practice more personal, expressive, and fun.
