Inspiration

Millions of patients — newborns, burn victims, post-surgery patients, and those experiencing depression or trauma — cannot verbally communicate their pain. Healthcare providers are left relying on subjective observation, which can be inconsistent and delayed. We wanted to build a system that gives a voice to those who can't speak for themselves, making invisible pain visible through data.

What it does

Cureo is a smart pain and emotional detection system that uses non-invasive biosensors to continuously monitor physiological signals — heart rate variability, skin conductance, facial muscle tension, body temperature, and brain wave patterns. An AI model combines these signals into a real-time pain and distress score, displayed on a color-coded dashboard (green/yellow/red) for caregivers. It also integrates with VR/AR therapeutic environments that dynamically adapt based on the patient's emotional state — calming the scene when stress rises, or introducing new activities when engagement increases.
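The green/yellow/red mapping can be sketched as a small thresholding function. This is a minimal illustration only: the cutoffs, the 0-100 score scale, and the hysteresis band are hypothetical placeholders, not our tuned clinical values.

```python
# Illustrative sketch of the color-coded dashboard logic.
# Thresholds and the hysteresis band are hypothetical examples.

GREEN_MAX = 40   # distress score 0-100; below this shows "green"
RED_MIN = 70     # at or above this shows "red"
HYSTERESIS = 5   # band to avoid flickering at a color boundary

def color_for_score(score: float, previous: str = "green") -> str:
    """Map a fused pain/distress score to a dashboard color.

    A small hysteresis band keeps the display from flip-flopping
    (and spamming caregivers) when the score hovers near a cutoff.
    """
    if previous == "red" and score >= RED_MIN - HYSTERESIS:
        # Only step down once the score is clearly below the red cutoff
        return "red"
    if score >= RED_MIN:
        return "red"
    if score >= GREEN_MAX:
        return "yellow"
    return "green"
```

The hysteresis band is one way to address the alert-fatigue concern described below: a score oscillating around a cutoff produces one stable color instead of a stream of state changes.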

How we built it

We designed Cureo around a multi-sensor data pipeline:

  • Wearable sensors (wrist bands, skin patches, optional EEG headband) collect raw physiological data
  • AI model fuses signals into a unified pain/distress score using patterns from nociceptive responses (A-delta and C fiber signals, autonomic nervous system reactions)
  • Dashboard interface visualizes real-time scores on tablets, hospital monitors, and mobile apps
  • VR/AR layer uses the distress score to dynamically adjust the therapeutic environment
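The fusion step above can be sketched as a weighted combination of normalized signals. The real system uses a trained AI model; the signal names, value ranges, and weights below are hypothetical placeholders for illustration.

```python
# Toy illustration of fusing physiological signals into a single
# 0-100 distress score. Ranges and weights are hypothetical, not
# values from the actual trained model.

SIGNAL_RANGES = {              # plausible raw-value ranges (illustrative)
    "hrv_ms": (20.0, 120.0),   # heart rate variability, RMSSD in ms
    "eda_us": (0.5, 20.0),     # skin conductance in microsiemens
    "emg_uv": (5.0, 100.0),    # facial muscle tension in microvolts
    "temp_c": (35.5, 39.0),    # body temperature in Celsius
}

WEIGHTS = {                    # illustrative; would come from training
    "hrv_ms": 0.35,
    "eda_us": 0.30,
    "emg_uv": 0.20,
    "temp_c": 0.15,
}

def fuse(readings: dict) -> float:
    """Return a 0-100 distress score from raw sensor readings.

    Higher EDA/EMG/temperature push the score up; higher HRV is a
    relaxation marker, so its normalized value is inverted.
    """
    score = 0.0
    for name, weight in WEIGHTS.items():
        lo, hi = SIGNAL_RANGES[name]
        x = min(max(readings[name], lo), hi)   # clamp to expected range
        norm = (x - lo) / (hi - lo)            # scale to 0..1
        if name == "hrv_ms":
            norm = 1.0 - norm                  # low HRV => more stress
        score += weight * norm
    return round(100 * score, 1)
```

A linear blend like this is only a stand-in for the learned model, but it shows the shape of the pipeline: clamp, normalize, invert relaxation markers, then weight and sum into one caregiver-facing number.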

Challenges we ran into

  • Differentiating pain signals from general stress or anxiety, since physiological responses overlap significantly
  • Designing for vulnerable, non-communicative populations (infants, sedated patients) where traditional user testing isn't possible
  • Balancing data sensitivity and privacy — all physiological data required encryption and strict access controls
  • Avoiding alert fatigue for nurses and caregivers by only triggering notifications at meaningful thresholds

Accomplishments that we're proud of

  • Designing a system that works across a wide spectrum of users — from newborns to adults with trauma or depression
  • Integrating physical pain detection with emotional wellbeing monitoring in a single platform
  • Building an adaptive VR therapy environment that responds in real time to patient biometrics
  • Creating an ethical framework that ensures Cureo supports, rather than replaces, clinical judgment

What we learned

  • Pain is both physical and emotional — the limbic system and prefrontal cortex are just as involved as nociceptors, which shaped our multi-signal approach
  • Compassionate design requires thinking deeply about who can't participate in the design process
  • Real-time healthcare tools must prioritize clarity over data density — caregivers need signals, not noise

What's next for Cureo

  • Clinical trials in postpartum wards and burn treatment centers to validate the AI model
  • Expanding the newborn monitoring module with specialized infant facial expression recognition
  • Deeper VR therapy integration for PTSD and trauma recovery programs
  • Pursuing FDA/CE medical device certification pathways
  • Partnering with hospitals to pilot the dashboard in real ICU and recovery room environments

Built With

  • figma
  • figmamake