Inspiration

Clinicians spend more time typing notes than talking to patients, and critical details get lost in the process. We were inspired by the Axxess mission to make care delivery smarter — and saw an opportunity to use AI to turn real doctor-patient conversations into structured, actionable medical records automatically, without the clinician typing a single word.

What it does

HealthSafe is a full-stack diagnostic assistant that captures live clinical conversations and transforms them into structured medical records. A clinician starts a session and a QR code is generated; the patient scans it to instantly join the room. The session is recorded and transcribed in real time using Deepgram's speech-to-text engine. When a session ends, an AI agent pipeline automatically extracts symptoms, duration, severity, and vitals such as fever and blood pressure; maps them to a real ICD-10 diagnostic code using semantic vector search; enriches the record with drug side effects; and generates a short patient-friendly summary. Every session — including the full transcript, audio recording, and AI-generated diagnostic result — is stored in Supabase so clinicians and patients can access their complete history at any time.
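The QR join flow above boils down to the server creating a session and encoding a join URL into the QR code the patient scans. A minimal sketch in TypeScript — the domain, route, and query-parameter names here are illustrative assumptions, not the deployed API:

```typescript
// Sketch of the QR join flow: the server creates a session record and
// builds the URL that gets encoded into the QR code. The base URL,
// "/join" route, and parameter names are placeholders for illustration.
type Session = { roomId: string; sessionId: string; status: "active" | "ended" };

function buildJoinUrl(baseUrl: string, session: Session): string {
  const url = new URL("/join", baseUrl);
  url.searchParams.set("room", session.roomId);
  url.searchParams.set("session", session.sessionId);
  return url.toString();
}

const session: Session = { roomId: "room-42", sessionId: "sess-1", status: "active" };
const joinUrl = buildJoinUrl("https://healthsafe.example", session);
// e.g. "https://healthsafe.example/join?room=room-42&session=sess-1"
```

A QR library then renders `joinUrl` as an image; scanning it drops the patient straight into the room with no typing.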

How we built it

The backend is a Node.js and TypeScript Express server with two layers. Sessions are created via a room system where a QR code encodes the session join link. The real-time layer uses Deepgram's WebSocket API for live transcription, managed through a session-based architecture where each consultation gets its own isolated runtime. Audio is recorded chunk by chunk, assembled into a file, and saved to Supabase Storage, organized by room and session. Transcripts are stored in Supabase Storage as text files, and session metadata — clinician and patient identifiers, session status, and timestamps — is persisted in PostgreSQL hosted on Supabase. The AI layer is a multi-step agentic pipeline: sequential LLM calls via Featherless-hosted LLaMA handle extraction and summarization, but critically the pipeline does NOT ask the LLM to guess ICD-10 codes. Instead, we embed the extracted symptoms and run a semantic vector similarity search against a database of real ICD-10-CM codes stored in Supabase with pgvector. The frontend is React with TypeScript, featuring live transcript views, QR code session joining, a dashboard showing past sessions per user, and authentication via Supabase Auth.
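The ICD-10 matching idea can be sketched in a few lines: embed the extracted symptom text, then pick the nearest real code by cosine similarity. In the actual app this runs as a pgvector query inside Supabase; the tiny table and three-dimensional "embeddings" below are illustrative placeholders, not real model output:

```typescript
// Nearest-neighbor lookup over a table of real ICD-10 codes. Because the
// result is always a row from the table, the returned code is guaranteed
// to exist — unlike asking an LLM to generate one.
type Icd10Entry = { code: string; description: string; embedding: number[] };

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

function nearestCode(symptomEmbedding: number[], table: Icd10Entry[]): Icd10Entry {
  return table.reduce((best, entry) =>
    cosineSimilarity(symptomEmbedding, entry.embedding) >
    cosineSimilarity(symptomEmbedding, best.embedding) ? entry : best);
}

// Toy data: the embeddings are made-up placeholders for the sketch.
const icd10Table: Icd10Entry[] = [
  { code: "R50.9", description: "Fever, unspecified", embedding: [0.9, 0.1, 0.0] },
  { code: "R05.9", description: "Cough, unspecified", embedding: [0.1, 0.9, 0.1] },
];
const match = nearestCode([0.85, 0.2, 0.05], icd10Table);
```

In production the same query is a single pgvector distance search over the full code table, so the lookup stays fast even as the code set grows.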

Challenges we ran into

Learning Supabase in one day, and getting real-time speech-to-text working reliably.

Accomplishments that we're proud of

We built a fully working end-to-end web app — from a patient scanning a QR code all the way to a structured, ICD-10-coded, drug-enriched medical record stored in Supabase — in a single hackathon. The QR code join flow makes the product feel genuinely usable in a real clinical setting, not just a demo. The ICD vector search architecture is something we're genuinely proud of — it's a meaningful technical improvement over prompting an LLM to guess, and it produces codes guaranteed to be real. Every session is fully persisted — audio recording, transcript, and AI diagnostic result — meaning HealthSafe isn't just a demo that works once; it's a system with real longitudinal patient history. The agentic pipeline gracefully degrades at every step with typed fallbacks, so even if one LLM call fails the system returns a partial result rather than crashing.
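The typed-fallback pattern mentioned above can be sketched as a small wrapper: each pipeline step returns a typed result with a safe default, so one failed LLM call yields a partial record instead of crashing the whole pipeline. The names (`Extraction`, `runStep`) are illustrative, not the real API:

```typescript
// Each step's output has a well-defined "empty" shape, so downstream
// steps always receive a valid value even when an upstream call fails.
type Extraction = { symptoms: string[]; severity: string | null };

const FALLBACK_EXTRACTION: Extraction = { symptoms: [], severity: null };

async function runStep<T>(step: () => Promise<T>, fallback: T): Promise<T> {
  try {
    return await step();
  } catch {
    // Degrade rather than propagate — the rest of the pipeline still runs.
    return fallback;
  }
}

// Usage: a step that throws (e.g. an LLM timeout) yields the fallback.
async function demo(): Promise<Extraction> {
  return runStep(async () => { throw new Error("LLM timeout"); }, FALLBACK_EXTRACTION);
}
```

Because the fallback has the same type as a successful result, no downstream step needs a special code path for failures.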

What we learned

Chaining LLM calls as an agent requires far more defensive coding than a single prompt — each step can fail independently and the output of one feeds the next. We learned that the most dangerous failure mode in healthcare AI is silent confidence — a model that returns a wrong ICD code with high confidence and no way to verify it. Replacing that step with vector search was the right architectural and clinical decision. We also learned that the session join experience matters as much as the AI pipeline — if getting into the room is confusing or slow, nothing else matters. On the product side, the most valuable AI feature isn't the fanciest model — it's reducing the friction between the clinical moment and the permanent medical record.

What's next for HealthSafe

We want to add real-time diagnostic hints that surface during the conversation, not just after it ends. We're planning to expand the ICD-10 vector database from 175 codes to the full 70,000+ code set for production-grade coverage. Direct EMR integration via FHIR APIs would let generated records push straight into existing clinical workflows without any manual step. Multilingual support using Deepgram's language models is a priority for serving diverse patient populations. We also want to build a clinician-facing history dashboard that surfaces trends across a patient's sessions over time — something Supabase's query layer makes very natural to build — and add a clinician review and sign-off flow so every AI-generated record is explicitly approved before it's finalized, keeping the human in the loop where it matters most.

Built With

Node.js, TypeScript, Express, React, Deepgram, Supabase (PostgreSQL, Storage, Auth, pgvector), Featherless-hosted LLaMA
