.✦ ݁˖ Inspiration
Healthcare disparities affect millions of patients every year. From language barriers and implicit bias to financial constraints limiting access to specialists, the gaps in care are real and serious. Telehealth has grown rapidly, but most platforms are little more than video calls with basic scheduling. We asked: what if a telemedicine platform could actually think alongside the clinician and advocate for the patient at the same time?
That question became Telepathy.
⊹ ࣪ ˖ What It Does
Telepathy is a full-stack telemedicine platform that goes far beyond scheduling and video calls. It uses AI at every layer to give patients a real voice and give clinicians the context they need to make better, faster, and fairer decisions.
.✦ ݁˖ For Patients
Empowering patients to communicate their needs clearly and access the right care.
- **Smart Doctor Search**: Location-aware search respecting state certification boundaries across 30 states, filterable by specialty, with an AI layer that factors in financial situation and medical history to recommend the most suitable providers, not just the nearest ones
- **3D Body Model**: Patients mark exactly where it hurts, describe pain type (sharp, dull, burning), rate severity, and add duration notes, giving the doctor a precise, structured picture before the appointment even starts
- **AI Video Analysis**: Patients submit a video of their symptoms. The AI automatically scans the footage, extracts key frames, and surfaces the most clinically relevant images directly to the clinician alongside the original video
- **Multilingual Support**: A real-time multilingual interface so language is never a barrier
- **Full Visit History**: Complete records, including transcripts, finalized clinical notes, lab results, immunization records, and After-Visit Summaries, all in one place
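The pain report captured from the 3D body model could be represented as a small structured record per annotation. The shape below is a hypothetical sketch for illustration, not Telepathy's actual schema; all field names are assumptions:

```typescript
// Hypothetical shape for one pain annotation from the 3D body model.
// Field names are illustrative only, not the platform's real schema.
interface PainAnnotation {
  region: string;                          // anatomical region picked on the model
  coordinates: [number, number, number];   // point on the mesh surface
  painType: "sharp" | "dull" | "burning";  // pain quality chosen by the patient
  severity: number;                        // patient-rated 1-10 scale
  durationNote: string;                    // free-text duration description
}

const report: PainAnnotation[] = [
  {
    region: "lower-back",
    coordinates: [0.12, -0.45, 0.08],
    painType: "sharp",
    severity: 7,
    durationNote: "started 3 days ago, worse in the morning",
  },
];

// A one-line summary a clinician view could render before the visit.
function summarize(annotations: PainAnnotation[]): string {
  return annotations
    .map((a) => `${a.region}: ${a.painType} pain, severity ${a.severity}/10`)
    .join("; ");
}

console.log(summarize(report)); // lower-back: sharp pain, severity 7/10
```

Because the report is structured rather than free text, it can be filtered, summarized, and attached to the chart without any parsing step.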
⊹ ࣪ ˖ For Clinicians
Giving clinicians the context and AI support they need to deliver better, less biased care.
- **Pre-Appointment Intelligence**: Before the visit, clinicians see the patient's pain annotations on the 3D model, AI-extracted video frames, financial context, bias reports, and suggested talking points, all generated automatically
- **AI Video Review**: The system surfaces AI-extracted screenshots from the patient's submitted video alongside the full footage, so clinicians can zero in on what matters without scrubbing through manually
- **Real-Time Diagnosis Heatmap**: A dynamically updating probability heatmap that shifts as the appointment progresses, giving clinicians a live second opinion without replacing their judgment
- **Bias Detection**: Active monitoring during appointments flags potential implicit bias in real time, helping ensure every patient gets equitable care
- **Auto Transcription into EMR**: Converts the full appointment into a structured clinical note: chief complaint, HPI, symptoms, vitals, assessment, and plan
- **Post-Appointment Follow-Up**: Priority indicators for critical follow-ups, rare disease correlation graphs, and pain-region annotations that surface connected body systems (e.g., foot pain flagging potential cardiovascular implications)
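A diagnosis heatmap that "shifts as the appointment progresses" can be modeled as a probability distribution over candidate conditions, reweighted and renormalized each time new evidence arrives. The sketch below shows that update step under simple assumptions; the condition names, likelihood values, and the `updateHeatmap` function are illustrative, not Telepathy's actual model:

```typescript
// Minimal sketch of a live diagnosis heatmap: a distribution over candidate
// conditions, renormalized as new evidence arrives during the appointment.
// All names and numbers here are hypothetical.
type Heatmap = Map<string, number>;

function updateHeatmap(prior: Heatmap, likelihoods: Map<string, number>): Heatmap {
  const posterior = new Map<string, number>();
  let total = 0;
  // Weight each prior probability by the likelihood of the new evidence
  // under that condition (default 1 = evidence is uninformative for it).
  for (const [condition, p] of prior) {
    const weighted = p * (likelihoods.get(condition) ?? 1);
    posterior.set(condition, weighted);
    total += weighted;
  }
  // Renormalize so the heatmap always sums to 1.
  for (const [condition, p] of posterior) {
    posterior.set(condition, total > 0 ? p / total : 0);
  }
  return posterior;
}

// Example: mid-appointment, a symptom favoring one condition shifts the map.
let heatmap: Heatmap = new Map([
  ["plantar fasciitis", 0.5],
  ["peripheral artery disease", 0.2],
  ["neuropathy", 0.3],
]);
heatmap = updateHeatmap(heatmap, new Map([["peripheral artery disease", 3.0]]));
```

Keeping the heatmap as explicit probabilities, rather than a single top answer, is what lets it act as a second opinion instead of a verdict.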
.✦ ݁˖ How We Built It
The frontend is built with Next.js for fast, clean routing between the patient and clinician portals. We used Three.js for the interactive 3D anatomical models, one of the most direct ways patients can communicate symptoms that language alone can't capture. The backend runs on Node.js with MongoDB handling records, transcripts, and clinical data. For the AI video analysis pipeline, we process patient-submitted footage to automatically extract clinically relevant frames, surfacing them to clinicians alongside the raw video. Additional AI services handle real-time transcription, multilingual processing, bias detection, and diagnosis hinting.
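One piece of the video pipeline described above, choosing which extracted frames to surface, can be sketched as a greedy selection: rank sampled frames by a relevance score (however the model produces it), then take the top k while enforcing a minimum spacing so the clinician doesn't see near-duplicate images. This is an illustrative sketch, not Telepathy's exact algorithm:

```typescript
// Greedy keyframe selection: given one relevance score per sampled frame,
// pick the k highest-scoring frames that are at least minGap indices apart.
// Illustrative only -- the real pipeline's scoring and selection may differ.
function selectKeyFrames(scores: number[], k: number, minGap: number): number[] {
  const ranked = scores
    .map((score, index) => ({ score, index }))
    .sort((a, b) => b.score - a.score); // best frames first
  const selected: number[] = [];
  for (const { index } of ranked) {
    if (selected.length >= k) break;
    // Skip frames too close to one already chosen, to avoid near-duplicates.
    if (selected.every((s) => Math.abs(s - index) >= minGap)) {
      selected.push(index);
    }
  }
  return selected.sort((a, b) => a - b); // return in chronological order
}

// Example: 8 sampled frames, surface 3, at least 2 frames apart.
const picks = selectKeyFrames([0.1, 0.9, 0.85, 0.2, 0.7, 0.3, 0.95, 0.4], 3, 2);
console.log(picks); // [ 1, 4, 6 ]
```

The spacing constraint is what makes the surfaced set feel like a summary of the video rather than three copies of its single most dramatic moment.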
⊹ ࣪ ˖ Challenges We Faced
The AI video analysis pipeline was one of our hardest technical challenges. Automatically identifying which frames are clinically meaningful, rather than just evenly sampled, required careful tuning. Building two deeply different UX flows on the same data model also required architectural discipline from day one. And surfacing AI insights like bias reports and diagnosis probabilities in a way that informs rather than overrides clinical judgment was a design challenge we kept revisiting throughout the build.
.✦ ݁˖ What We Learned
The hardest part of building in healthcare isn't the technology. It's earning trust: trust in the data, trust in the AI, and trust from the people using it on both sides of the appointment. Telepathy taught us that the best AI in this space is the kind that's invisible: it just makes everyone in the room a little more prepared, a little less biased, and a little more heard.