Inspiration

The moment Bill entered McGill, he sighed and yawned.

"We won't have access to a gym for 24 hours," said Bill.

So we made him run the tests for Tracks.

What it does

Tracks counts workout reps in real time, using OpenCV and machine learning to monitor shoulder and wrist movements. It then adjusts the BPM of your music to match your workout intensity, integrating seamlessly with Spotify: the faster or more intense the workout, the faster the music, for a more personalized and energetic experience. It's free to use, and users choose their music genre while the app handles the intensity adjustments.

How we built it

We first debated whether to make the application web-based or app-based. We settled on Flutter for its cleanliness and ease of use, and built an app MVP before implementing the final website. The backend is written in Swift, while the WebSocket service that powers motion detection is written in Python using OpenCV.

Challenges we ran into

One of the main challenges hit us midway through implementing the BPM conversion: we discovered that Spotify had deprecated the API we needed to find music by BPM. However, we remembered our lord and savior, GumLoop, who generously granted us 15,000 tokens and a Premium account. With their help, we leveraged GumLoop's AI features and its Perplexity block to find music for each BPM range, allowing us to complete the feature.

Furthermore, implementing hand gestures turned out to be a huge hassle, because we had to manage exercise states through hand gestures using OpenCV. OpenCV's built-in depth detection proved inaccurate, since our demo device had only one camera. The solution turned out to be intuitive: we compute a conversion ratio at every frame. Knowing the user's real face width and comparing it to the width OpenCV measures in pixels, we can recover the actual dimensions of all other limbs and distances, regardless of how far the user stands from the camera.
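The proportion trick above boils down to a couple of lines. This is a hedged sketch: the 14 cm average face width and the helper names are illustrative assumptions, not our exact constants.

```python
# Sketch of the per-frame proportion trick: real face width / measured face
# width in pixels gives a cm-per-pixel scale, so any other pixel distance
# in the same frame converts to real-world centimetres.
FACE_WIDTH_CM = 14.0  # assumed average real-world face width


def cm_per_pixel(face_width_px: float) -> float:
    """Scale factor for the current frame, from the detected face width."""
    return FACE_WIDTH_CM / face_width_px


def pixel_dist_to_cm(dist_px: float, face_width_px: float) -> float:
    """Convert any pixel distance in this frame to centimetres."""
    return dist_px * cm_per_pixel(face_width_px)
```

Because the ratio is recomputed on every frame, the conversion stays valid as the user moves toward or away from the single camera.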

Another major issue was figuring out how to accurately convert reps per second into beats per minute so the app could match the queried songs. Through trial and error, we had to work out how to translate movement rhythm into beats; this was the core of our project.

Accomplishments that we're proud of

We are proud to have been able to work out during the hackathon. Most testing resulted in buckets of sweat, despite the exercises starting off as simple push-ups.

What we learned

We learned a lot about the complexities of real-time applications, especially around streaming real-time data and images to and from the backend. Using WebSockets with Flutter and Dart, and figuring out how to transmit frames efficiently, turned out to be a great challenge and an even greater learning experience.

What's next for Tracks

Definitely more exercises! We're thinking of implementing complex weight-training exercises, from the bench press to cable machines, on top of what we currently support. Another feature would be expanding beyond Spotify to accommodate the other big players in the music scene, notably Apple Music, YouTube Music, and Amazon Music. We've also adapted it for internal usage at Piñata Pitch.
