Inspiration
Our project is inspired by exposure therapy, a method where individuals build confidence in social interactions through gradual, repeated practice. SocialEase AR helps users become more comfortable with conversations by simulating interactions of varying difficulty levels. Starting with a friendly and supportive virtual character, users can ease into conversations. As they progress, the character displays more complex emotions like frustration or sadness, allowing users to practice responding appropriately in diverse social scenarios.
What It Does
When users put on the Meta AR headset, they are immersed in a real-life environment with a virtual person in front of them. The character initiates a conversation, and a dialogue box appears with three response options. Users press the start button to record, speak their chosen response aloud, and press the stop button to confirm. The conversation evolves dynamically based on their choices, creating a realistic and interactive experience.
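The branching flow above can be sketched as a tiny dialogue tree. This is an illustrative Python sketch only; the actual project implements this inside Unity, and all names here (`DialogueNode`, the prompts, the responses) are hypothetical.

```python
# Hypothetical sketch of a branching dialogue node: the character's line
# plus a fixed set of response options, each leading to the next node.
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    """One line spoken by the virtual character, with preset responses."""
    prompt: str
    options: dict = field(default_factory=dict)  # response text -> next node

    def choose(self, response):
        """Advance the conversation based on the user's spoken choice."""
        return self.options.get(response)

# A minimal two-step conversation.
goodbye = DialogueNode("It was nice talking to you!")
intro = DialogueNode(
    "Hi there! How is your day going?",
    {
        "Pretty good, thanks!": goodbye,
        "A bit stressful.": goodbye,
        "I'd rather not say.": goodbye,
    },
)

nxt = intro.choose("Pretty good, thanks!")
print(nxt.prompt)  # -> It was nice talking to you!
```

The same structure extends naturally to the harder difficulty levels: each node can carry an emotion tag that drives the character's animation.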
How We Built It
We developed SocialEase AR using Unity, leveraging Meta’s AR toolkits to create immersive scenes and interactions. The 3D character was modelled and animated in Blender, then imported into Unity. By integrating speech recognition and an interactive dialogue system, we built an engaging and responsive environment for social practice.
Challenges We Ran Into
As beginners in Unity and AR development, we faced a steep learning curve. One of the biggest challenges was implementing speech-to-text functionality, which required capturing the recorded audio as a byte array, converting it to a WAV file, and transcribing it into text—a complex pipeline for newcomers. Additionally, exporting animations from Blender to Unity proved challenging, as we had to ensure proper rigging, scaling, and compatibility between the two platforms.
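The audio-to-WAV step described above can be illustrated with a short sketch. The project itself does this inside Unity/C#; this Python version using the standard-library `wave` module is only an analogy, and the function name and parameters are assumptions.

```python
# Sketch of the speech-to-text preprocessing step: packing raw microphone
# samples into a 16-bit mono WAV file before sending it for transcription.
import struct
import wave

def samples_to_wav(samples, path, sample_rate=16000):
    """Write floats in [-1.0, 1.0] as a 16-bit mono PCM WAV file."""
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)          # mono microphone input
        wav.setsampwidth(2)          # 2 bytes = 16-bit samples
        wav.setframerate(sample_rate)
        # Clamp each float and scale it to a signed 16-bit integer.
        frames = b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples
        )
        wav.writeframes(frames)

samples_to_wav([0.0, 0.5, -0.5, 0.25], "clip.wav")
```

The resulting file can then be handed to any speech-recognition backend that accepts WAV input.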
Accomplishments We’re Proud Of
Despite the challenges, we successfully built a working prototype that demonstrates our core idea. Seeing our vision come to life and knowing it could help people struggling with social anxiety has been incredibly rewarding.
What We Learned
This project taught us valuable skills in Unity, AR development, Blender, and speech recognition technology. We also gained a deeper understanding of how interactive experiences can be designed to support social skill development.
What’s Next for SocialEase AR
While our prototype is functional, there’s room for improvement. Here’s what we plan to add next:
- Character Reactions & Animations: Adding facial expressions and body language to make the virtual character more lifelike.
- Open-Ended Conversations: Allowing users to respond freely instead of choosing from preset options for more organic interactions.
- Enhanced Environments & Audio: Improving background visuals and adding sound effects for greater immersion.
- Feedback on Voice Signal Stability: Providing real-time feedback on voice clarity, volume, and confidence to help users improve their speaking skills.
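The planned voice-stability feedback could start from something as simple as per-chunk RMS volume. This is a hedged sketch under assumed thresholds and names; the real feature would need proper signal analysis.

```python
# Illustrative sketch: flag chunks of the recording that are too quiet,
# as a first step toward feedback on volume and stability.
import math

def rms(chunk):
    """Root-mean-square amplitude of one chunk of float samples."""
    return math.sqrt(sum(s * s for s in chunk) / len(chunk))

def volume_feedback(samples, chunk_size=4, quiet_threshold=0.1):
    """Label each chunk 'ok' or 'too quiet' (threshold is an assumption)."""
    feedback = []
    for i in range(0, len(samples), chunk_size):
        chunk = samples[i:i + chunk_size]
        feedback.append("ok" if rms(chunk) >= quiet_threshold else "too quiet")
    return feedback

print(volume_feedback([0.4, -0.4, 0.5, -0.5, 0.01, -0.02, 0.0, 0.01]))
# -> ['ok', 'too quiet']
```

Extending this with pitch variance or pause detection would move it toward the "confidence" feedback described above.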
With these enhancements, SocialEase AR can become an even more effective tool for helping individuals overcome social anxiety and build confidence in real-world interactions.
We also created an animation using Niantic to help individuals with social anxiety through meditation. The animation provides a calming, interactive space that encourages relaxation and mindful breathing. By combining soothing visuals with meditation techniques, it offers a tool for reducing stress and promoting emotional well-being. https://youtube.com/shorts/rBvB75Fwm4k?feature=share