Inspiration
We started with the idea of synesthesia... what if you could see music? We wanted to translate audio into motion graphics where faster music creates sharper shapes and brighter colors, and different frequencies map to different visuals.
Navigation apps like Google Maps are purely functional. There's no sense of discovery or wonder. What if we could combine music visualization with walking directions to make getting somewhere more enjoyable?
We also realized this could increase accessibility: BeatMap lets people who are deaf or hard of hearing experience music visually while navigating to their destination.
What it does
BeatMap is a mixed reality navigation app for Meta Quest that turns walking into an immersive, music-driven experience. It overlays turn-by-turn navigation onto the real world through passthrough cameras while your Spotify music syncs with visual effects around you. Speak your destination, and watch as a vibrant path appears at your feet - pulsing and reacting to the beat of your music. Bass drops make the path glow brighter, the beat drives particle effects, and you arrive at your destination without ever looking down at a phone.
How we built it
- Styly for prototyping
- Unity + Meta XR SDK for the VR experience with passthrough cameras
- Companion mobile app (React Native/Expo) streams GPS coordinates and compass heading to the Quest via WebSocket
- Google Maps Directions API provides real walking routes with turn-by-turn instructions
- Spotify Web API + OAuth for music playback control and track info
- Custom Python backend (deployed on Railway) analyzes songs and extracts per-beat frequency data (bass, mids, treble)
- Groq AI interprets voice commands to extract destination addresses
- Native Android speech recognition for hands-free destination input
- Blender for modeling
- Figma for prototyping
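The backend's per-beat analysis boils down to bucketing an FFT of each audio frame into bass, mids, and treble energy. A simplified Python sketch of that idea (the band cutoffs and function names here are illustrative, not our exact pipeline):

```python
import numpy as np

# Illustrative band cutoffs in Hz; the real pipeline may differ.
BANDS = {"bass": (20, 250), "mids": (250, 4000), "treble": (4000, 16000)}

def band_energies(frame: np.ndarray, sample_rate: int = 44100) -> dict:
    """Return the fraction of spectral energy in each band for one frame."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    total = spectrum.sum() or 1.0  # avoid division by zero on silence
    return {
        name: float(spectrum[(freqs >= lo) & (freqs < hi)].sum() / total)
        for name, (lo, hi) in BANDS.items()
    }
```

Running this per beat and streaming the three numbers to the headset is what lets the visuals react differently to a bass drop versus a hi-hat.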
Challenges we ran into
Spotify literally locked us out. Midway through development, we discovered Spotify disabled new developer app registrations for their API. We had to track down an already-approved app on someone else's account to authenticate against.
The Quest has no GPS. Meta Quest headsets don't have onboard GPS, which is a problem for a navigation app. Our solution was building a companion phone app that streams GPS coordinates and compass heading to the headset over WebSocket.
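The phone-to-headset link is just small JSON messages over the socket. A simplified sketch of the payload (the field names are illustrative, not our exact schema):

```python
import json
import time

def make_gps_message(lat: float, lon: float, heading_deg: float) -> str:
    """Encode one GPS update as the phone would send it over the WebSocket."""
    return json.dumps({
        "type": "gps",
        "lat": lat,
        "lon": lon,
        "heading": heading_deg % 360.0,  # normalize compass heading to [0, 360)
        "ts": time.time(),
    })

def parse_gps_message(raw: str):
    """Decode a message on the headset side; ignore anything that isn't GPS."""
    msg = json.loads(raw)
    if msg.get("type") != "gps":
        return None
    return msg["lat"], msg["lon"], msg["heading"]
```

Keeping the payload this small let us send updates several times a second without the headset falling behind.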
Path alignment was difficult. Converting GPS coordinates to VR world space sounds simple until you try it. Angles were always off, paths would drift, and the route never quite lined up with the real world. We spent hours debugging coordinate transformations, compass calibration, and drift correction to get paths that matched the streets beneath your feet.
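The core transform is roughly: convert lat/lon deltas into local meters with an equirectangular approximation (fine over walking distances), then rotate by the compass heading so the path lines up with where the user is facing. A simplified Python sketch of the math (not our actual Unity code):

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def gps_to_local(lat, lon, origin_lat, origin_lon, heading_deg):
    """Convert a GPS point to headset-local (x, z) meters.

    heading_deg is degrees clockwise from north, like a phone compass;
    in the local frame +z is the direction the user faces, +x is right.
    """
    dlat = math.radians(lat - origin_lat)
    dlon = math.radians(lon - origin_lon)
    north = dlat * EARTH_RADIUS_M
    east = dlon * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    # Rotate world (east, north) into the heading-aligned local frame.
    h = math.radians(heading_deg)
    x = east * math.cos(h) - north * math.sin(h)
    z = east * math.sin(h) + north * math.cos(h)
    return x, z
```

Most of our drift bugs came down to the two details the comments call out: forgetting the cos(latitude) scaling on longitude, and mixing up which way the heading rotation goes.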
Integrating Unity UI with hand gestures was difficult: they are two separate systems whose functions and components aren't compatible with each other. Much of the documentation was also outdated, which made it harder to troubleshoot and build a world-space UI that users can interact with through hand gestures.
Trying to integrate heavier shaders using Shader Graph quickly ran into hardware limitations. Many of the visual effects that worked smoothly on desktop simply wouldn’t run on the Meta Quest due to GPU and performance constraints.
Getting the arrow to move in a way that felt intentional rather than chaotic was difficult. Directly mapping audio data onto its transform caused jittery, unreadable motion. We had to experiment with responding selectively to specific frequency bands so the arrow's movement felt rhythmic, directional, and helpful rather than distracting.
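The fix amounts to gating out low-energy noise and low-pass filtering the band energy before it ever touches the transform. A simplified sketch of that idea (the class name and parameter values are illustrative):

```python
class ArrowDriver:
    """Drive arrow motion from one frequency band, smoothed so it
    reads as rhythmic rather than jittery."""

    def __init__(self, smoothing: float = 0.15, threshold: float = 0.3):
        self.smoothing = smoothing  # 0..1 blend per update; lower = smoother
        self.threshold = threshold  # ignore band energy below this level
        self.level = 0.0

    def update(self, bass_energy: float) -> float:
        """Advance one frame; return the smoothed level to apply to the arrow."""
        # Gate: weak signals snap to zero instead of wiggling the arrow.
        target = bass_energy if bass_energy >= self.threshold else 0.0
        # Exponential smoothing toward the gated target.
        self.level += (target - self.level) * self.smoothing
        return self.level
```

Tuning `smoothing` and `threshold` per band is what turned the motion from noise into something that felt like it was dancing to the track.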
Accomplishments that we're proud of
Our project actually works outside in the real world. We also heavily iterated on the beat sync and frequency mapping until the visuals genuinely felt connected to the music.
What we learned
Persistence. Every feature hit a wall: Spotify's API, GPS drift, path alignment. But pushing through blockers instead of pivoting resulted in a complete project.
What's next for BeatMap
We would like to expand our library of animations to support a wider range of experiences. Beyond that, some ideas include:
- Genre-specific visualizations - different visual styles for hip-hop vs classical vs electronic
- Standalone GPS - eliminate the companion app once Quest gets native GPS support
- Wall anchors - anchor animations and other elements to the surfaces of buildings in real life as users walk by
Built With
- blender
- c#
- expo.io
- google-maps
- groq-ai
- meta-quest-3
- mruk
- python
- react-native
- spotify
- styly
- unity