Inspiration

Navigating the world as a visually impaired person often means relying on tools that are audio-reliant or limited in the information they provide. We were inspired by the idea of making environmental awareness felt rather than seen or heard. By turning spatial information into haptic feedback, we wanted to create a solution that is intuitive, discreet, and empowering, and that works alongside existing mobility aids instead of replacing them.

What it does

Haptix is a wearable haptic vest that enhances spatial awareness for visually impaired users. Using ultrasonic sensors and AI-based computer vision, it detects nearby obstacles and objects and translates that information into localized vibration patterns on the user’s body. This lets users sense the direction, proximity, and urgency of obstacles in real time, without needing to look at a screen or constantly listen to audio cues.
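The core idea is a mapping from obstacle distance and bearing to which motor fires and how hard. A minimal sketch of that kind of mapping in Python (the zone boundaries, sensor range, and three-motor layout here are illustrative assumptions, not our exact calibration):

```python
def vibration_pattern(distance_cm, bearing_deg):
    """Map an obstacle's distance and bearing to (motor_index, intensity, pulse_ms).

    Hypothetical mapping: closer obstacles produce stronger, faster pulses;
    the bearing selects one of three motor zones (left, center, right).
    """
    # Pick a motor zone from the obstacle's bearing (straight ahead = 0 degrees).
    if bearing_deg < -20:
        motor = 0          # left-side motor
    elif bearing_deg > 20:
        motor = 2          # right-side motor
    else:
        motor = 1          # center motor

    # Clamp to an assumed ultrasonic sensor range of 10-200 cm.
    distance_cm = max(10, min(distance_cm, 200))

    # Closer obstacle -> higher intensity (0-255) and a shorter pulse interval.
    intensity = int(255 * (200 - distance_cm) / 190)
    pulse_ms = int(100 + 4 * distance_cm)   # 140 ms (near) .. 900 ms (far)
    return motor, intensity, pulse_ms
```

A very close obstacle dead ahead (10 cm, 0°) would drive the center motor at full intensity with rapid pulses, while a distant one off to the left would barely register.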

How we built it

We built Haptix using a combination of hardware and software components:

  • Arduino to handle ultrasonic distance sensors and control vibration motors
  • Raspberry Pi connected to multiple cameras for real-time video capture
  • YOLOv8 for object detection and classification
  • A custom vibration “language” that maps distance and direction to different haptic patterns
  • Serial and network communication between devices to coordinate sensor data and feedback

This modular setup allowed us to experiment rapidly while keeping the system flexible.
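Coordinating the Pi and the Arduino over serial requires an agreed frame format so haptic commands survive a noisy link. A sketch of one simple scheme (the start byte, field layout, and function names here are illustrative, not our exact protocol):

```python
import struct

START_BYTE = 0xAA  # assumed frame delimiter

def encode_feedback(motor: int, intensity: int, pulse_ms: int) -> bytes:
    """Pack one haptic command into a 6-byte frame for the Arduino.

    Layout: start byte, motor id (uint8), intensity (uint8),
    pulse interval (uint16 little-endian), then a 1-byte checksum.
    """
    frame = struct.pack("<BBBH", START_BYTE, motor, intensity, pulse_ms)
    checksum = sum(frame) & 0xFF
    return frame + bytes([checksum])

def decode_feedback(frame: bytes):
    """Inverse of encode_feedback; raises ValueError on a corrupt frame."""
    if len(frame) != 6 or frame[0] != START_BYTE:
        raise ValueError("bad frame")
    if sum(frame[:5]) & 0xFF != frame[5]:
        raise ValueError("checksum mismatch")
    _, motor, intensity, pulse_ms = struct.unpack("<BBBH", frame[:5])
    return motor, intensity, pulse_ms
```

On the Pi side, frames like these would be written to the Arduino with a library such as pySerial; the checksum lets the Arduino silently drop any frame corrupted in transit instead of buzzing the wrong motor.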

Challenges we ran into

One of our biggest challenges was latency. Processing multiple camera feeds with real-time object detection placed heavy strain on our setup, leading to delays that hurt responsiveness. Integrating multiple sensors, managing serial communication, and preventing data conflicts between systems were also difficult. Like many hackathon projects, we faced time constraints, hardware limitations, and last-minute debugging issues, but we pushed through and adapted our design accordingly.
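A common pattern for taming this kind of lag (a sketch of the general idea rather than our exact pipeline code) is to stop processing frames in arrival order and instead always hand the detector the newest one, discarding anything stale:

```python
import queue

def latest_only(q: "queue.Queue"):
    """Drain the queue and return only the newest item, or None if empty.

    When the detector is slower than the camera, processing frames in
    order only grows the backlog; dropping stale frames keeps haptic
    feedback tied to what is in front of the user right now.
    """
    item = None
    while True:
        try:
            item = q.get_nowait()
        except queue.Empty:
            return item
```

A capture thread pushes frames into the queue as fast as the camera produces them, while the detection loop calls `latest_only` each iteration, so end-to-end delay stays near one inference time instead of accumulating.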

Accomplishments that we're proud of

  • Successfully integrating computer vision and haptics into a single wearable system
  • Designing an intuitive vibration mapping that users can quickly learn
  • Getting real-time obstacle detection and feedback working on constrained hardware
  • Building a meaningful assistive technology concept under hackathon pressure
  • Creating a project that prioritizes accessibility, dignity, and independence

What we learned

We learned how challenging real-time perception systems are when working with limited compute resources. We gained hands-on experience with hardware-software integration, latency optimization, and designing for accessibility. Most importantly, we learned that effective assistive technology isn’t about adding more features: it’s about delivering the right information in the least intrusive way.

What's next for Haptix

Next, we want to:

  • Optimize performance using edge acceleration or cloud-assisted inference
  • Improve object prioritization and ground-level hazard detection
  • Add user customization through a web or mobile interface
  • Conduct real user testing with visually impaired individuals
  • Refine the vest’s ergonomics, comfort, and battery life

Our long-term goal is to turn Haptix into a practical, everyday mobility companion that complements existing assistive tools and adapts to each user’s needs.
