Inspiration

Nearly 800,000 people in the US experience a stroke each year, and more than 140,000 die as a result. The majority of strokes block blood flow to the brain, often causing paralysis and impaired motor control. Despite the growing number of people suffering strokes every year, very few therapies target stroke rehabilitation. Our augmented reality app uses cutting-edge technology to give users accessible tools and a personalized regimen to regain mobility from the comfort of their own homes.

What it does

Patients are guided through simple rehabilitation exercises built around hand poses known to improve mobility. The app visualizes a holographic hand in augmented reality and encourages the user to follow along with the hologram. A machine learning model provides real-time feedback on how well the user has completed each exercise.
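The feedback step above can be sketched as a simple comparison between the model's prediction and the pose the hologram is currently demonstrating. The pose labels, threshold, and type names below are illustrative assumptions, not the app's actual values:

```swift
// One step of the guided exercise: the pose the hologram is showing.
struct ExerciseStep {
    let targetPose: String  // e.g. "fist" or "open_palm" (assumed label names)
}

// Turn the classifier's (label, confidence) output into user-facing
// feedback. The 0.8 confidence threshold is an assumed example value.
func feedback(for prediction: (label: String, confidence: Float),
              step: ExerciseStep,
              threshold: Float = 0.8) -> String {
    guard prediction.label == step.targetPose else {
        return "Keep trying: match the hologram's pose."
    }
    return prediction.confidence >= threshold
        ? "Great job! Pose completed."
        : "Almost there. Hold the pose a little longer."
}
```

In the app this function would run on each camera frame, so the message updates in real time as the user's hand moves.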

How I built it

We trained a machine learning model on roughly 100 images using Azure Custom Vision, then exported it to Core ML and integrated it into a Swift app to assess the user's hand poses. Using ARKit, we created an augmented reality application that guides users to correctly perform each pose.
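A minimal sketch of the classification step, assuming the Custom Vision export is wrapped in a generated Core ML class (here called `HandPoseClassifier`, a hypothetical name) and fed ARKit camera frames through the Vision framework:

```swift
import ARKit
import CoreML
import Vision

// Classifies the hand pose visible in an ARKit camera frame.
final class HandPoseDetector {
    private let request: VNCoreMLRequest

    init() throws {
        // HandPoseClassifier stands in for the model exported from
        // Azure Custom Vision; the class name is an assumption.
        let model = try VNCoreMLModel(
            for: HandPoseClassifier(configuration: MLModelConfiguration()).model)
        request = VNCoreMLRequest(model: model)
        request.imageCropAndScaleOption = .centerCrop
    }

    /// Returns the top predicted pose label and its confidence.
    func classify(frame: ARFrame) throws -> (label: String, confidence: Float)? {
        let handler = VNImageRequestHandler(
            cvPixelBuffer: frame.capturedImage, orientation: .right)
        try handler.perform([request])
        guard let best = (request.results as? [VNClassificationObservation])?.first
        else { return nil }
        return (best.identifier, best.confidence)
    }
}
```

Running the model through Vision rather than calling Core ML directly handles resizing and cropping the camera image to the model's expected input size.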

Challenges I ran into

Developing our first ML model, and creating an augmented reality iOS app for the first time with ARKit and Swift.

Accomplishments that I'm proud of

Training our own ML model, building our first mobile app, and learning a new programming language.

What I learned

The basics of training an ML model, a new programming language (Swift), and the ARKit workflow.

What's next for Stroke Saver

Training a more accurate, less biased model on a larger data set covering more poses. Better gamification to encourage users to complete daily exercises. Integration with the Stanford Stroke Center app.

Built With

  • arkit
  • azure-custom-vision
  • coreml
  • swift