Inspiration
"Hey, hey, hey Just think, while you've been gettin' down and out about the liars And the dirty, dirty cheats of the world You could've been gettin' down to this sick beat"
- Taylor Swift
What it does
- Uses Lens Studio's hand tracking to power a gesture-based, DJ beat-making AR Snapchat Lens
How we built it
- Uses Lens Studio's Hand Tracking module to detect and follow finger positions in real time
- JavaScript scripts map gestures (swipes, flicks, pinches) to DJ actions like triggering beats, adjusting volume, and activating scratch mode
- AudioComponent is programmatically started and modulated based on detected gestures
- Rave-inspired visual effects (VFX) are created using shaders and particle systems
- Core interactions use event-driven logic for responsive performance and a seamless AR experience
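The event-driven gesture-to-action mapping described above can be sketched as a small dispatcher. This is an illustrative TypeScript sketch, not Lens Studio's actual scripting API: the gesture names, the `DJState` shape, and the handler table are all hypothetical stand-ins for how gestures might map to DJ actions.

```typescript
// Hypothetical gesture names and state; Lens Studio's real API differs.
type Gesture = "swipe" | "flick" | "pinch";

interface DJState {
  playing: boolean;     // is the beat currently triggered?
  volume: number;       // 0.0 – 1.0
  scratchMode: boolean; // flick toggles scratch mode
}

// Each gesture maps to exactly one DJ action that updates the state.
const handlers: Record<Gesture, (s: DJState) => DJState> = {
  swipe: (s) => ({ ...s, playing: !s.playing }),                 // trigger/stop the beat
  flick: (s) => ({ ...s, scratchMode: !s.scratchMode }),         // toggle scratch mode
  pinch: (s) => ({ ...s, volume: Math.min(1, s.volume + 0.1) }), // nudge volume up
};

// The event-driven core: a detected gesture is dispatched to its handler.
function dispatch(state: DJState, gesture: Gesture): DJState {
  return handlers[gesture](state);
}

let state: DJState = { playing: false, volume: 0.5, scratchMode: false };
state = dispatch(state, "swipe"); // beat starts playing
state = dispatch(state, "pinch"); // volume rises to ~0.6
```

In the actual lens, the dispatch step would fire from Lens Studio's hand-tracking events and drive the AudioComponent and VFX rather than a plain state object.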
Challenges we ran into
- Iterating on an MVP while avoiding scope creep
- Not getting carried away with Lens Studio's host of features
Accomplishments that we're proud of
- Implementing the "scratch mode" feature, allowing you to seamlessly control audio level with the flick of a finger
- Scripting VFX that stay in sync with the audio for a fully immersive experience
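A minimal sketch of how flick-driven volume control could work in scratch mode: fingertip vertical velocity is mapped to an audio level and clamped to a valid range. The velocity-to-volume curve and its gain constant are hypothetical choices for illustration, not the exact formula used in the lens.

```typescript
// Hypothetical scratch-mode mapping: fingertip vertical velocity -> volume.
// prevY/currY are normalized fingertip positions from hand tracking,
// dt is the time between frames in seconds.
function scratchVolume(prevY: number, currY: number, dt: number): number {
  const velocity = (currY - prevY) / dt; // normalized units per second
  const level = 0.5 + velocity * 0.1;   // center at half volume; 0.1 gain is arbitrary
  return Math.min(1, Math.max(0, level)); // clamp to the valid [0, 1] range
}
```

A fast upward flick pushes the level above 0.5 and a downward flick pulls it below, so holding the finger still settles the beat at half volume.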
What we learned
- Learning a new tool is hard, but rewarding!
What's next
- YOU using it on Snapchat!
Links: https://www.snapchat.com/unlock/?type=SNAPCODE&uuid=849f295570d0479f8705cf217529e32e&metadata=01
Built With
- anthropic
- javascript
- lens-studio
- typescript