Inspiration

Many patients with late-stage neurodegenerative diseases (namely, ALS) have completely lost the ability to communicate as a by-product of losing virtually all voluntary muscular function, except for control of their eyes. Currently, these patients need around-the-clock care from a live-in caretaker, or they must opt for specialized assisted-living homes. This places extreme physical, emotional, and financial stress on all parties involved: the family of the patient, the caretaker, and the patient themselves. Aside from being economically infeasible to subsidize constant care and attentiveness, the patient loses any semblance of individuality or independence, being constantly "monitored" by their caretaker. Additionally, the caretaker must be alert and ready to assist for long periods of time, even while the patient is asleep. This is incredibly taxing, so our device attempts to leverage the patient's only remaining voluntary movement, eye movement, as a way for patients to communicate needs to their caretaker.

What it does

There is a biological electric potential between the cornea and the retina of the human eye, which creates a measurable electric field within the eye. Our device detects eye movement using electrodes placed at various locations around the face, generating voltage data between the two respective electrode terminals. We leverage the inherent geometry of the eye's electric field: controlled eye movement changes the electric flux between the two electrodes, allowing for accurate and reliable detection of intentional patient "actions."
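The key idea above, that a deliberate, held gaze shift produces a sustained voltage deviation while incidental glances produce only brief spikes, can be sketched as a simple threshold detector. This is an illustrative sketch, not the device's actual algorithm; the baseline, threshold, and sample values are assumed ADC counts.

```python
# Hypothetical sketch: flag an intentional eye movement when the voltage
# deviates from a resting baseline for several consecutive samples.
# Baseline, threshold, and min_samples are illustrative assumptions.

def detect_intentional_movement(samples, baseline=512, threshold=150, min_samples=5):
    """Return True if the signal deviates from `baseline` by more than
    `threshold` ADC counts for at least `min_samples` consecutive readings."""
    run = 0
    for v in samples:
        if abs(v - baseline) > threshold:
            run += 1
            if run >= min_samples:
                return True
        else:
            run = 0
    return False

# A brief glance produces a transient spike; a held gaze shift sustains it.
glance  = [512, 530, 700, 520, 512, 511, 513, 512]       # transient only
gesture = [512, 700, 710, 705, 708, 702, 512, 512]       # sustained deviation

print(detect_intentional_movement(glance))   # False
print(detect_intentional_movement(gesture))  # True
```

Requiring a sustained deviation (rather than a single out-of-range sample) is one way to reject blinks and electrical noise while still catching deliberate gestures.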

How we built it

We built this by connecting the electrodes of an EMG (electromyography) sensor to precise locations on the face. The sensor's signals are read directly by an Arduino Uno, which transmits the data to a Raspberry Pi B+ (RPi). The RPi runs our custom software for processing and categorizing the input, which accurately differentiates between "natural" eye movement and intentional patient "communication."
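The RPi side of a pipeline like this can be sketched as: parse the readings the Arduino streams over serial, group them into fixed-size windows, and label each window. Everything here is an assumption for illustration — the framing (one integer per line), the window size, the thresholds, and the serial port name are not taken from the actual device.

```python
# Hypothetical sketch of the RPi-side pipeline: parse newline-delimited ADC
# readings, window them, and label each window by simple amplitude rules.
# All thresholds and the line format are illustrative assumptions.

def classify_window(window, baseline=512, blink_thresh=250, gaze_thresh=120):
    """Crude amplitude-based label for one window of ADC samples."""
    peak = max(abs(v - baseline) for v in window)
    if peak > blink_thresh:
        return "blink"   # large, sharp deflection -> likely involuntary
    if peak > gaze_thresh:
        return "gaze"    # moderate deflection -> candidate intentional input
    return "rest"

def stream_labels(lines, window_size=8):
    """Parse integer samples from text lines and yield a label per full window."""
    window = []
    for line in lines:
        try:
            window.append(int(line.strip()))
        except ValueError:
            continue                 # skip malformed serial lines
        if len(window) == window_size:
            yield classify_window(window)
            window = []

# In deployment the lines would come from pyserial, e.g.
#   import serial
#   lines = serial.Serial("/dev/ttyACM0", 9600)   # port name is an assumption
fake_stream = ["512", "515", "640", "650", "645", "640", "520", "512"]
print(list(stream_labels(fake_stream)))  # ['gaze']
```

Separating parsing (`stream_labels`) from classification (`classify_window`) keeps the amplitude rules easy to replace with a more sophisticated classifier later without touching the serial-handling code.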

Challenges we ran into

Initially, we had trouble properly interfacing the EMG sensor to the Arduino, and difficulty recording and processing the signals. We also had trouble with the product design aspect of the challenge, moving from a mess of wires and sensors to a clean marketable product.

Accomplishments that we're proud of

It was really rewarding to learn basic anatomy and biology and successfully leverage that knowledge to create a novel product with real-world applications. Furthermore, it was exciting to work on a project that can effect positive change in the world.

What we learned

We learned a lot about the overall design process: finding a need, conceptualizing a solution to address that need, learning the basic science and familiarizing ourselves with the requisite knowledge of the problem, creating the hardware, and then converting a messy tangle of hardware and circuitry into a real, aesthetic product.

What's next for EyeCom

We see many potential applications of our device: accurate and cheap eye tracking has many downstream uses in hospital settings, and our device can be extended to drive text-to-speech software.
