Signifeye

Signifeye aims to create convenience for patients with visual disabilities, one sense at a time. The code comprises firmware for the Arduino boards along with an accompanying React Native application, usable by both patients and patient assistants, with haptic feedback manipulation. Developed as part of TreeHacks 2023.

Signifeye Flowchart

Image 1. The app comprises a remote control and a dashboard for patient tracking; the remote-control side of the app focuses on haptic feedback for the patient.

Signifeye Gallery

Image 2. A mockup of the hardware: ultrasonic sensors mounted on the glasses the patient wears, with each sensor reading mapped to a distinct auditory tone the patient hears.
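The distance-to-tone mapping described above could be sketched as follows. This is an illustrative assumption, not the project's actual firmware: the function name, pitch range, sensor range (typical of an HC-SR04 ultrasonic module), and linear mapping are all hypothetical. It is written as plain C++ so it compiles off-device; on the board, the returned frequency would feed Arduino's `tone()` call.

```cpp
#include <algorithm>

// Hypothetical mapping: closer obstacles produce higher-pitched tones.
// Ranges below are illustrative assumptions, not the project's real values.
int distanceToToneHz(long distanceCm) {
    // Clamp readings to a typical ultrasonic sensor's range (~2-400 cm).
    long d = std::clamp(distanceCm, 2L, 400L);
    // Linearly map 2..400 cm onto 2000..200 Hz (near = high pitch).
    return 2000 - static_cast<int>((d - 2) * 1800 / 398);
}
```

A linear map keeps the code simple; a logarithmic curve would give finer pitch resolution at close range, where warning the patient matters most.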

