Morpheus - Meta Ray-Bans ASL Tutor

Inspiration

We believe everyone deserves to be understood. Over 500,000 people use American Sign Language (ASL) as their primary language, yet there is still a huge gap in real-time communication between signers and non-signers. Voice-to-text is everywhere (think Siri or Alexa), but ASL translation tech hasn't kept up. We wanted to change that: not just with translation, but by creating a patient, understanding tutor that helps bridge these communication gaps.

What it does

Morpheus is like having a friendly ASL interpreter and teacher in your browser. It can:

  • Capture signs through your Ray-Ban Metas (or any other device!) in real-time.

  • Recognize and categorize finger positions to determine ASL words and letters.

  • Give you instant feedback on your signing.

  • Act as a patient tutor, helping you learn and practice ASL.

  • Keep the learning experience smooth and stress-free with a clean, simple interface.

How we built it

Think of Morpheus as having three main parts:

😀 The Face (Frontend):

  • Clean, simple web design using HTML/CSS that works on any device.
  • Real-time video feedback so you can see your signs.
  • Progress tracking that doesn't feel overwhelming.

🧠 The Brains (Backend):

  • MediaPipe to track your hand movements.
  • A smart AI system we trained to recognize ASL signs.
  • A tutoring system powered by OpenAI that adapts to your learning style.

(Details of the technical stack can be found on the 2nd slide.)
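To give a feel for how sign recognition on hand landmarks can work: MediaPipe Hands reports 21 (x, y) points per hand, with landmark 0 at the wrist. The sketch below is a simplified, hypothetical illustration (not our trained model): it normalizes a pose for position and scale, then does a nearest-neighbor match against per-sign templates.

```python
import math

WRIST = 0  # MediaPipe Hands convention: landmark 0 is the wrist

def normalize(landmarks):
    """Translate so the wrist is the origin, then scale to unit size,
    making the pose invariant to where and how big the hand appears."""
    wx, wy = landmarks[WRIST]
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def distance(a, b):
    """Sum of per-landmark Euclidean distances between two poses."""
    return sum(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in zip(a, b))

def classify(landmarks, templates):
    """Pick the sign whose template pose is closest to the input pose."""
    pose = normalize(landmarks)
    return min(templates,
               key=lambda sign: distance(pose, normalize(templates[sign])))
```

A learned classifier over these normalized features works the same way in spirit: the normalization step is what lets one template generalize across hand sizes and camera positions.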

⚡ The Nervous System (Data & Communication):

  • Super-fast connections so everything feels instant.
  • Careful error handling so things don't break when mistakes happen.
  • Smart data management to keep everything running smoothly.
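"Careful error handling" here means treating transient failures (a dropped frame upload, a flaky API request) as retryable rather than fatal. A minimal sketch of that idea, with hypothetical names, is a backoff wrapper like this:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.1,
                 retry_on=(ConnectionError, TimeoutError)):
    """Call fn(); on a transient error, wait and retry with the delay
    doubling each attempt. Re-raise once attempts are exhausted so the
    caller still sees persistent failures."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error
            time.sleep(base_delay * (2 ** attempt))
```

Wrapping each network call this way keeps one bad packet or slow response from interrupting a tutoring session.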

Challenges we ran into

Building Morpheus wasn't easy! We faced some tough challenges:

Speed:

Our biggest challenge was making everything work in real time without lag. The Meta Ray-Bans don't even have an official developer environment, so we had to build something resembling a "man-in-the-middle attack," with a ton of trial and error. We experimented with streaming platforms, video sharing, and frame-by-frame capture. In the end, we found that OBS screen capture of a Facebook Messenger call was the fastest approach.

Other challenges included:

  • Training a model to understand signs accurately.
  • Creating a teaching style that's helpful but not frustrating.
  • Making sure the interface is welcoming to everyone, regardless of their ASL experience.

Accomplishments that we're proud of

Looking back, we're really proud that we:

  • Created something that actually helps people communicate better.
  • Built a tutor that's patient, understanding, and adapts to each person.
  • Made the whole experience feel natural and friendly.
  • Kept everything running smoothly across different devices.
  • Created a system that's ready to grow and improve.

Prizes:

  • Education Track Grand Prize: Built an innovative ASL education platform combining real-time sign recognition, adaptive AI tutoring, and progress tracking to make learning more accessible and engaging.
  • Perplexity Hacking With Perplexity: Used Perplexity extensively during development to research ASL teaching methods, study gesture recognition algorithms, and understand best practices in language education.
  • Delve Most Intuitive UX: Created a clean, minimal interface that makes ASL learning approachable through real-time feedback and clear visual cues.
  • LumaLabs Reimagining Visual Creation: Enhanced sign language recognition through advanced visual processing and real-time feedback visualization.
  • Neo Most Likely to Become a Business: We're bringing ASL technology into the 21st century, focusing first on young family members and friends of ASL users. Surprising to many, the sign language economy is worth at least $3B, with over $1B spent on ASL interpreting alone. Our beachhead users would be young, tech-savvy members of Gen Z and Gen Alpha who want to learn sign language, and the parents of those young people who want the same thing!
  • OpenAI Most Creative Use: Created an intelligent tutoring system that adapts its teaching style and provides personalized feedback based on each learner's progress.
  • Pear VC Best Customer Insights: Built our platform based on extensive research and continuous feedback. We actually had a Zoom call with a potential customer mid-build (we hadn't even finished our MVP yet!). We fundamentally believe in building fast and iterating faster.

What we learned

This project taught us about integrating advanced AI and computer vision with wearable technologies, something that none of us had done before. We also learned about the incredible complexity and beauty of ASL, and learned a ton of signs along the way.

What's next? Morpheus to the Moon!

We're just getting started! Here's what we're excited about:

  • Creating fun, engaging ways to practice ASL
  • Building a community where learners can help each other
  • Making it even better at understanding complex signs and phrases
  • Adding more ways to track your progress
  • Developing games and challenges to make learning more fun
  • Getting feedback from the ASL community to make Morpheus even better

We built Morpheus not just as a tool, but as a bridge between communities. While it's not perfect, we believe it's a step toward making communication more accessible for everyone.
