An updated version of a childhood favorite. Hold up your hand, and a virtual mood ring appears on your finger. What mood will it detect?
Project Details
Role: Creator and Developer
Development Platform: Lens Studio
Published on: Snapchat
Programming Language: JavaScript
Features: Hand tracking, head and face tracking, facial expression recognition
Inspiration & Concept
Nostalgia drives some of my AR experiences. I love brainstorming ways to reimagine older analog toys and games in digital form.
Physical mood rings rely solely on color to indicate mood, but I wanted to lean into AR’s unique features to create more immersive visual representations of the different moods.
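At its core, the idea is to read facial-expression signals and map the strongest one to a mood. A minimal sketch of that mapping in plain JavaScript is below; the weight names (`smile`, `browsDown`, `browsUp`) and the `pickMood` helper are illustrative stand-ins, not the actual Lens Studio API or my production code.

```javascript
// Hypothetical sketch: map expression weights (0..1) to a mood label.
// The weight names are placeholders for whatever signals the face
// tracker exposes; this is not the Lens Studio API.
function pickMood(weights) {
  var candidates = [
    { name: "happy",     score: weights.smile || 0 },
    { name: "angry",     score: weights.browsDown || 0 },
    { name: "surprised", score: weights.browsUp || 0 }
  ];
  // Default to "calm" unless some expression clears a small threshold.
  var best = { name: "calm", score: 0.3 };
  for (var i = 0; i < candidates.length; i++) {
    if (candidates[i].score > best.score) {
      best = candidates[i];
    }
  }
  return best.name;
}
```

Once a mood is chosen, the lens can swap in the matching visual effect on the ring, rather than just tinting a color the way the physical toy does.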
Development Process
This Snapchat lens is actually version 2.0. The original was among the earliest AR experiences I built with Lens Studio.
Why did I recreate it? Both Lens Studio and I have grown significantly since then. Lens Studio’s hand tracking has improved immensely, and I’ve gained much more experience. I was determined to honor my original vision.
The revised version was my first time fully embracing Lens Studio’s Asset Library—a great resource for 3D models, scripts, sound effects, and other assets that enhance your project. Since 3D modeling isn’t my strong suit, I was pleasantly surprised to find much of what I needed already in the library.
