About the Project: What Was That?

Inspiration

What Was That? was inspired by our grandparents, who are hard of hearing and often spend long periods of time alone at home. We noticed that many everyday but important sounds, such as doorbells, alarms, knocks, or someone calling out, were easy for them to miss. This wasn’t just inconvenient. It created anxiety for both them and our families, especially when no one else was around to notice what was happening.

While there are many smart home devices that rely on sound, they often assume the user can hear the alert itself. We wanted to rethink this assumption. Our guiding question became: how can we make sound-based awareness accessible to people who may not be able to hear those sounds at all?

What We Built

What Was That? is an accessibility-focused safety app designed for people who are hard of hearing and live alone. The app listens for meaningful sounds in the home and translates them into clear, tactile alerts.

When an important sound is detected, the user is notified through their phone and, when available, via vibrations on their smartwatch. This allows users to stay aware of what’s happening around them without relying on audio cues.

As an additional layer of support, we built a lightweight check-in and alert system. If a user doesn’t respond to a prompt, or if no meaningful activity is detected over time, caregivers can be notified. This secondary feature complements sound detection by addressing moments when silence itself may signal a problem.
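The core of that check-in logic can be sketched as a small state machine: track the last time the user responded, and escalate once a response window elapses. This is an illustrative sketch, not the app's actual code; the class name, method names, and timeout handling are our assumptions.

```typescript
// Hypothetical sketch of check-in escalation: if the user has not
// responded within the timeout window, a caregiver should be notified.
class CheckInMonitor {
  private lastResponseAt: number;

  constructor(startedAt: number, private timeoutMs: number) {
    this.lastResponseAt = startedAt;
  }

  // Called whenever the user acknowledges a check-in prompt.
  recordResponse(at: number): void {
    this.lastResponseAt = at;
  }

  // True once the response window has elapsed with no activity.
  shouldNotifyCaregiver(now: number): boolean {
    return now - this.lastResponseAt > this.timeoutMs;
  }
}
```

In a real app, `shouldNotifyCaregiver` would be polled from a background task or scheduled job rather than called with explicit timestamps; passing `now` in keeps the logic easy to test.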

How We Built It

We designed the system around a simple but flexible event pipeline:

  • A mobile device acts as an acoustic sensor, monitoring audio levels locally.
  • When a meaningful sound is detected, the app generates an event.
  • That event triggers haptic notifications on the phone and connected wearables, as well as optional alerts for caregivers.

To respect privacy, the phone is treated as a triggered sensor, not a continuous recorder. Audio clips are short, optional, and only used to confirm events when necessary. By modeling both sound detections and missed check-ins as events, we were able to support multiple alert types using the same backend infrastructure.
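The unified event model described above might look like the following sketch. The type names, fields, and alert channels are illustrative assumptions, not the project's actual schema; the point is that a discriminated union lets one handler serve both sound detections and missed check-ins.

```typescript
// Hypothetical sketch: both sound detections and missed check-ins are
// variants of one event type, so they share the same backend path.
type HomeEvent =
  | { kind: "sound"; label: string; at: number }
  | { kind: "missed_check_in"; lastSeenAt: number; at: number };

type Alert = { channel: "haptic" | "caregiver"; message: string };

// A single dispatch function covers every event type; adding a new
// alert source means adding a variant, not new infrastructure.
function toAlerts(event: HomeEvent): Alert[] {
  switch (event.kind) {
    case "sound":
      return [{ channel: "haptic", message: `Sound detected: ${event.label}` }];
    case "missed_check_in":
      return [
        { channel: "haptic", message: "Please check in" },
        { channel: "caregiver", message: "No response to check-in" },
      ];
  }
}
```

TypeScript's exhaustiveness checking on the `switch` also guarantees that a newly added event variant can't silently go unhandled.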

Tech Stack

Built with: React Native and Expo for the mobile app, Node.js and TypeScript for the backend, Twilio for SMS and call alerts, and Vultr for cloud hosting. iOS development and testing were done using Xcode.

Challenges

One major challenge was ensuring reliability without overwhelming users. Sound detection that is too sensitive can lead to frequent false alerts, which quickly erodes trust. We had to carefully tune thresholds and focus only on sounds that genuinely matter.
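One common way to keep detection from flooding the user is to pair a loudness threshold with a cooldown period. The sketch below shows that debouncing pattern; the function names and parameter values are illustrative, not the thresholds we actually shipped.

```typescript
// Hypothetical sketch of debounced detection: a loudness sample only
// fires an alert if it crosses the threshold AND enough time has
// passed since the last alert, so one event can't spam the user.
function makeDetector(threshold: number, cooldownMs: number) {
  let lastAlertAt = -Infinity;
  return function shouldAlert(level: number, now: number): boolean {
    if (level < threshold) return false;
    if (now - lastAlertAt < cooldownMs) return false;
    lastAlertAt = now;
    return true;
  };
}
```

Tuning then becomes a matter of adjusting two numbers per sound category rather than reworking the detection logic itself.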

Another challenge was designing for accessibility without making the experience feel clinical or stigmatizing. Because our primary users are hard of hearing, we prioritized haptic feedback, clear visual cues, and simple interactions, while avoiding language that felt alarming or patronizing.

Finally, we had to make hard scoping decisions. Advanced features like true fall detection require specialized hardware and extensive testing, so we focused on building a system that we could confidently demo and that directly addressed the problem we set out to solve.

What We Learned

This project taught us that accessibility is often about translation, not replacement. By translating sound into vibration and visual alerts, we were able to make existing signals usable for a wider group of people.

We also learned that restraint matters. A system designed to help should stay quiet unless something truly needs attention. By focusing on meaningful sounds and intentional alerts, What Was That? supports independence while still providing reassurance to users and their families.
