Inspiration

Health problems escalate fast. We wanted to give officers an edge: real-time physiological data on a subject's stress state, before things go wrong.

What it does

Health uses an iPhone camera to contactlessly measure pulse, breathing rate, and stress levels via remote photoplethysmography (rPPG). An ElevenLabs AI agent delivers live verbal de-escalation guidance, and every session is analyzed and logged.

How we built it

A Swift/SwiftUI iOS app integrating the Presage SmartSpectra SDK for biometric detection and ElevenLabs conversational AI for real-time voice guidance. Built in six hours at the hackathon.

Challenges we ran into

Getting Presage's headless processing pipeline working without their native UI was the hardest part: the SDK's recording gate only opens when `statusCode == .ok`. ElevenLabs session persistence required separating intentional disconnects from post-turn state changes.

Accomplishments that we're proud of

A fully working contactless vitals pipeline feeding a live AI advisor: no wearables, no friction, no extra hardware.

What we learned

rPPG is extremely sensitive to lighting and face positioning. SDK internals matter: reading the source code saved hours of debugging.

What's next for HealthCam

Body-cam integration, multi-subject tracking, a department dashboard, and PTSD early-warning indicators for officers themselves.
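The recording gate we hit in the Presage pipeline can be sketched roughly as below. This is a minimal Combine-based sketch of the pattern, not the SDK's actual API: the `StatusCode` enum, the status publisher, and the recorder type are all assumed names for illustration.

```swift
import Combine

// Hypothetical status codes mirroring the SDK's readiness signal.
enum StatusCode {
    case ok
    case faceNotCentered
    case lowLight
}

final class VitalsRecorder {
    private var cancellables = Set<AnyCancellable>()
    private(set) var isRecording = false

    // Subscribe to the status stream and only record while the
    // pipeline reports .ok — the gate described above.
    func bind(to statusPublisher: AnyPublisher<StatusCode, Never>) {
        statusPublisher
            .sink { [weak self] status in
                guard let self else { return }
                if status == .ok, !self.isRecording {
                    self.isRecording = true
                    // Headless rPPG capture would start here.
                } else if status != .ok, self.isRecording {
                    self.isRecording = false
                    // Capture pauses until the face/lighting recovers.
                }
            }
            .store(in: &cancellables)
    }
}
```

Driving this from the SDK's own status callbacks (rather than a custom publisher) is what required reading the SDK source, since the gate is not documented for headless use.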
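The ElevenLabs session-persistence fix amounts to classifying why a disconnect happened before tearing anything down. A hedged sketch of that logic, assuming hypothetical names (`DisconnectReason`, `SessionManager`) rather than the SDK's real callback types:

```swift
// Hypothetical reasons a conversation can end; the real SDK's
// callbacks and enum cases may differ.
enum DisconnectReason {
    case userEnded        // user explicitly ended the session
    case turnCompleted    // agent finished a speaking turn
    case transportDropped // transient network / LiveKit drop
}

final class SessionManager {
    private var userRequestedEnd = false

    func endSession() {
        // Mark the disconnect as intentional before closing.
        userRequestedEnd = true
        // The SDK's end-session call would go here.
    }

    // Returns true when the session should be torn down for good.
    // Post-turn state changes and transient drops keep it alive.
    func shouldTearDown(after reason: DisconnectReason) -> Bool {
        switch reason {
        case .userEnded:
            return true
        case .turnCompleted, .transportDropped:
            return userRequestedEnd
        }
    }
}
```

The key design point is the `userRequestedEnd` flag: without it, every post-turn state change looks identical to the user ending the session, and the app reconnects (or fails to) at the wrong times.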
Built With
- avfoundation
- combine
- coreimage
- elevenlabs-conversational-ai-sdk
- ios
- livekit
- presage-smartspectra-sdk-(rppg-biometric-engine)
- swift
- swiftprotobuf
- swiftui
- xcode