ClearPath Project Story

Inspiration

As children of immigrants, we’ve seen firsthand how language barriers can make everyday life harder: at doctor’s offices, parent-teacher meetings, and family gatherings where not everyone speaks the same language. For deaf and hard-of-hearing individuals, these challenges are even greater. We wanted to create something that puts accessibility first, using hardware to solve a human problem: what if captions could appear right in your world, wherever you look?

What it does

ClearPath overlays real-time captions in your AR field of view on Meta Quest, making spoken conversations accessible to deaf and hard-of-hearing users. Companion phones join the same session and receive instant translations, enabling multilingual communication for families, caregivers, and colleagues. Features include AI-powered topic detection, a high-contrast mode for low vision, adjustable caption sizes, and text-to-speech output.

How we built it

  • WebXR for immersive AR on Meta Quest, with DOM overlay for captions
  • Deepgram API for real-time speech-to-text streaming
  • Node.js WebSocket server for multi-device room synchronization
  • MyMemory Translation API for translations in 7+ languages
  • Self-signed HTTPS certificates for Quest browser compatibility
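The room-synchronization piece can be sketched as a small routing layer on the server. This is an illustrative model, not our exact implementation: the class and field names are hypothetical, and in practice each entry would also hold a live WebSocket connection. The idea is that each room tracks companion devices and their preferred language, and every incoming caption fans out as one message per device, flagged for translation when the device's language differs from the speaker's.

```javascript
// Hypothetical sketch of per-room caption routing (names illustrative).
class CaptionRoom {
  constructor(id) {
    this.id = id;
    this.companions = new Map(); // deviceId -> { lang }
  }

  join(deviceId, lang) {
    this.companions.set(deviceId, { lang });
  }

  leave(deviceId) {
    this.companions.delete(deviceId);
  }

  // Produce one outbound message per companion; captions already in a
  // device's language pass through, the rest are marked for translation.
  route(captionText, sourceLang) {
    const messages = [];
    for (const [deviceId, { lang }] of this.companions) {
      messages.push({
        deviceId,
        text: captionText,
        targetLang: lang,
        needsTranslation: lang !== sourceLang,
      });
    }
    return messages;
  }
}
```

Keeping routing as plain data like this makes it easy to unit-test the fan-out logic separately from the WebSocket transport.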

Challenges

Meta Quest’s browser has limited support for the Web Speech API, so we had to rely on Deepgram’s streaming STT, which was tricky to debug, especially with microphone permissions and no console access on the headset. Setting up HTTPS for local development was a hassle, since the Quest browser requires a self-signed certificate and manual approval. Managing real-time translation across multiple companions with different languages also required careful WebSocket architecture.
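For the translation side, MyMemory exposes a public GET endpoint that takes the text and a `langpair` of "source|target" ISO codes. A minimal sketch of building that request (the helper name is ours; the endpoint and parameters are MyMemory's documented API):

```javascript
// Build a MyMemory translation request URL for one caption.
// langpair uses "source|target" ISO 639-1 codes, e.g. "en|es".
function buildTranslationUrl(text, sourceLang, targetLang) {
  const params = new URLSearchParams({
    q: text,
    langpair: `${sourceLang}|${targetLang}`,
  });
  return `https://api.mymemory.translated.net/get?${params}`;
}
```

The server only needs to call this once per distinct companion language in a room, then broadcast the translated caption to every device sharing that language.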

What we learned

Building for accessibility means thinking about edge cases most developers overlook: high-contrast needs, caption sizing, audio alternatives. AR isn’t just about games; it’s a powerful tool for inclusion.
