Inspiration

The idea for HapticChat started with one important question:

How can people communicate when they cannot see or hear?

I was deeply inspired by the challenges deaf-blind people face every day. Many rely on expensive, specialized tools, and I wanted to change that.

My goal was to create something simple and powerful — using a tool most people already have: a smartphone. I decided to use Morse code and vibrations to help people communicate through touch, in real time, without needing extra devices.


What it does

HapticChat is an iOS app that lets deaf-blind users and caregivers communicate using Morse code vibrations.

  • In User Mode, deaf-blind users can tap Morse code messages and even hear their surroundings converted into Morse code.
  • In Caregiver Mode, caregivers type messages, and the app plays them back as vibration patterns the user can feel.

The app also includes a Learn Morse Code section where users can practice tapping and feeling letters to build confidence.

HapticChat turns a smartphone into a touch-based communication tool — no extra devices, no internet needed. It’s a simple, accessible way to break communication barriers and bring people closer through the power of touch.
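To illustrate the core idea, here is a minimal sketch of encoding text as International Morse Code. This is not HapticChat's actual source; the `morseTable` and `encodeMorse` names are illustrative, and the table is trimmed to letters only.

```swift
import Foundation

// Illustrative lookup table: letters to International Morse Code.
// Characters not in the table (spaces, punctuation) are skipped.
let morseTable: [Character: String] = [
    "A": ".-",   "B": "-...", "C": "-.-.", "D": "-..",  "E": ".",
    "F": "..-.", "G": "--.",  "H": "....", "I": "..",   "J": ".---",
    "K": "-.-",  "L": ".-..", "M": "--",   "N": "-.",   "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.",  "S": "...",  "T": "-",
    "U": "..-",  "V": "...-", "W": ".--",  "X": "-..-", "Y": "-.--",
    "Z": "--..",
]

// Encode a message as Morse, one space between letters.
func encodeMorse(_ text: String) -> String {
    text.uppercased()
        .compactMap { morseTable[$0] }
        .joined(separator: " ")
}
```

For example, `encodeMorse("SOS")` produces `"... --- ..."`, which the app would then play as a vibration sequence.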


How I built it

I built HapticChat using SwiftUI for iOS (iPhones).

  • I used the iPhone haptic engine to create meaningful vibration patterns.
  • I managed app state using SwiftUI's StateObject and EnvironmentObject property wrappers.
  • I saved user preferences and progress with AppStorage.

I focused on making the app simple, easy to use, and mainly driven by touch — no need for sight or sound.


Challenges I ran into

I faced several challenges:

  • Getting the vibration patterns just right took many tries.
  • Designing the experience so new users could understand the app and learn Morse code quickly.
  • I could not test directly with deaf-blind users, so I had to research and test carefully.

Each challenge pushed me to improve and build a better app.


Accomplishments that I'm proud of

  • I successfully created an app that allows deaf-blind users and caregivers to communicate through touch, using only a smartphone — no extra devices needed.

  • I carefully tuned the haptic feedback to feel natural, clear, and easy to understand — a technical challenge I’m proud to have solved.

  • Most importantly, I am proud to have built something that can help people feel more connected, independent, and included.


What I learned

  • I learned the importance of empathy in design — how thinking deeply about the real needs of deaf-blind users shaped every part of the app.

  • I discovered how small technical details, like the timing and strength of vibrations, can completely change how a message is felt and understood.

  • I learned how to combine old ideas (Morse code) with modern technology to create something useful, accessible, and meaningful.

  • I improved my skills in SwiftUI, state management, and using the iPhone’s haptic engine to deliver precise feedback.


What's next for HapticChat

  • User testing with the deaf-blind community to gather real feedback, improve usability, and make sure the app truly meets their needs.

  • Integrate with wearables (like smartwatches) to expand touch-based communication beyond the phone and offer more flexibility.

Built With

  • avfoundation
  • corehaptics
  • speech
  • swift
  • swiftui