Inspiration

We have contributed to various product developments focused on individuals with chronic conditions such as diabetes, heart disease, COPD, and cancer treatment.

Through direct conversations with patients, we discovered that many are too unwell to even leave their bed, feeling weak and sensitive to the point where contacting their doctor or using an app becomes a challenge. Yet, obtaining real-time insights into a patient's well-being and symptom severity is crucial for enhancing the care team's ability to tailor their support effectively.

We thought that a simple voice-enabled assistant that could record symptoms, severity, and blood sugar readings, offer over-the-counter remedies, or advise when to contact a doctor could solve this problem in many cases.

What it does

This voice-enabled Symptom Tracker allows the user to log an overall distress rating, based on the nationally recognized "NCCN Distress Thermometer," as well as the specific symptoms that have been bothering them and the severity of each. These ratings currently get saved to an Airtable; in a production version they would flow into the patient's records, where the care team could access them and be alerted if any red-flag conditions appear.
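As an illustrative sketch only (the real build uses Voiceflow's integration rather than hand-written code), posting one check-in to Airtable's REST API could look like the following; the base ID, table name, and field names here are hypothetical placeholders:

```python
import json
import urllib.request

# Hypothetical base ID, table name, and field names -- not the real ones.
AIRTABLE_URL = "https://api.airtable.com/v0/appXXXXXXXXXXXXXX/SymptomLog"

def build_log_record(distress, symptom, severity, glucose):
    """Shape one check-in as an Airtable 'create records' payload."""
    return {"records": [{"fields": {
        "Distress": distress,      # 0-10 NCCN Distress Thermometer rating
        "Symptom": symptom,        # e.g. "nausea"
        "Severity": severity,      # 0-10 patient-reported severity
        "BloodGlucose": glucose,   # mg/dL
    }}]}

def post_log(record, api_key):
    """POST the record to Airtable (requires a valid API key)."""
    req = urllib.request.Request(
        AIRTABLE_URL,
        data=json.dumps(record).encode("utf-8"),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A red-flag check (say, distress of 8 or above) could run against the same payload before it is saved, which is where the care-team notification would hook in.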

We included an LLM option where the user can get recommendations on how to alleviate symptoms with non-medical, easy-to-do-at-home treatments. This can bring immediate comfort in situations where the patient might otherwise have hesitated to reach out to their doctor for help.

The care team and patient could work together to set how often the voice app prompts the user to record their current state, giving the care team data at the frequency they need.

How we built it

A colleague and I put this together using Voiceflow and Airtable. The subject matter draws on experience and on nationally recognized healthcare frameworks for tracking the progression of distress and symptom severity.

Challenges we ran into

It took some time to figure out how to get things to post to Airtable, since the available tutorials were missing some pieces of data. We are still not sure how to pull the data back from the table to read to users. Getting the AI capture step to recognize the numeric ratings the user spoke was also harder than expected.
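Reading saved entries back out is the piece we have not wired up yet. As a sketch under the same hypothetical field names (nothing here reflects the actual build), listing recent records and turning one into a spoken sentence might look like:

```python
import json
import urllib.parse
import urllib.request

def fetch_recent_logs(base_id, table, api_key, limit=3):
    """List the most recent records from an Airtable table."""
    params = urllib.parse.urlencode({
        "pageSize": limit,
        "sort[0][field]": "Created",       # hypothetical created-time field
        "sort[0][direction]": "desc",
    })
    req = urllib.request.Request(
        f"https://api.airtable.com/v0/{base_id}/{table}?{params}",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return [r["fields"] for r in json.load(resp)["records"]]

def speak_back(fields):
    """Turn one saved record into a sentence the voice app could read aloud."""
    return (f"Last time you rated your distress {fields['Distress']} out of 10 "
            f"and reported {fields['Symptom']} at severity {fields['Severity']}.")
```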

The more challenging part would be connecting this to an actual patient database so the data could be fed in properly.

Accomplishments that we're proud of

We were happy to achieve the empathetic tone in the copy and voice that the app uses. It feels like a friendly, compassionate, and encouraging coach during a tough time in the user's life. We believe it could help many people be more mindful of their care and achieve better outcomes. The LLM addition that responds in real time to the user's problem is also fantastic!

What we learned

We learned a lot about adapting the tool to do what we wanted: posting to Airtable, offering users a defined list of symptoms to choose from, and saving the various pieces of helpful data, like blood glucose readings, symptoms, severity, and date and time.

We were able to direct the LLM to offer contextual recommendations based on previously provided patient data, and to recommend only safe, over-the-counter remedies so it never suggests anything dangerous to the user. For anything beyond that, it directs the user to their healthcare provider.
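For illustration, those guardrails can be sketched as a system prompt assembled around the patient's context; the wording below paraphrases the idea rather than quoting the exact prompt in the app:

```python
def build_remedy_prompt(symptom, severity, history):
    """Assemble chat messages that constrain the LLM to safe, OTC-only advice."""
    system = (
        "You are a gentle, encouraging symptom-care assistant. Suggest only "
        "safe, non-prescription, easy-to-do-at-home comfort measures. Never "
        "diagnose, and never recommend prescription medication. If the "
        "symptom sounds severe, worsening, or unfamiliar, tell the user to "
        "contact their healthcare provider instead."
    )
    user = (f"Patient context: {history}. Current symptom: {symptom}, "
            f"severity {severity}/10. What might bring some relief at home?")
    return [{"role": "system", "content": system},
            {"role": "user", "content": user}]
```

Keeping the safety rules in the system message rather than the user turn makes them harder for conversational input to override.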

What's next for Voice Enabled, Remote Symptom Tracker

I am looking forward to testing it with more people to refine how we phrase things and how the app adapts to user input it didn't expect.

Built With

Voiceflow, Airtable
