🏥 AI-Powered Health Monitoring for At-Risk Beneficiaries -- Challenge Dialogue
Built for ConuHacks X 2026
MedRedemption is a comprehensive AI-powered health monitoring platform designed to provide proactive healthcare intervention for elderly and at-risk individuals. The system combines wearable sensors (coming soon), real-time voice-based health consultations, automated clinic appointment booking, and guardian monitoring dashboards to ensure timely medical assistance.
Key Capabilities:
- 🎙️ Natural voice conversations with AI health assistant (Google Gemini Live API)
- 🏥 Intelligent health triage with risk scoring and automated clinic booking
- 📱 Real-time sensor monitoring (heart rate, fall detection, activity tracking)
- 🗺️ Live guardian dashboard with beneficiary location and telemetry
- 📞 Automated clinic calling via AI voice agent (ElevenLabs + Twilio)
- 👥 Multi-beneficiary management for family guardians
┌─────────────────────────────────────────────────────────────────────────┐
│ MedRedemption Platform │
└─────────────────────────────────────────────────────────────────────────┘
┌──────────────────────┐
│ Android Sensor │ Kotlin + Jetpack Compose
│ (Beneficiary App) │ - Movesense BLE Wearable (HR + IMU)
│ │ - Phone Sensors (Fallback)
│ 📱 ❤️ 🏃 │ - Voice Interface
└──────────┬───────────┘
│ Socket.IO (Telemetry)
│ WebSocket (Voice Audio)
▼
┌──────────────────────────────────────────────────────────────────────────┐
│ Python Backend (Flask + SocketIO) │
│ ┌────────────────┐ ┌──────────────┐ ┌─────────────────────────────┐ │
│ │ Gemini Live API│ │ DeepAgent │ │ ElevenLabs Integration │ │
│ │ Voice Assistant│─▶│ Health Triage│─▶│ (Clinic Calling) │ │
│ │ (2.5 Flash) │ │ (GPT-4o) │ │ │ │
│ └────────────────┘ └──────────────┘ └─────────────┬───────────────┘ │
│ │ │ │ │
│ ▼ ▼ ▼ │
│ ┌──────────────────────────────────────────────────────────────────┐ │
│ │ MongoDB Atlas (MedRedemption DB) │ │
│ │ • users • beneficiaries • call_logs • sensor_data │ │
│ └──────────────────────────────────────────────────────────────────┘ │
└────────────────────────────┬──────────────────────────┬─────────────────┘
│ │
│ Socket.IO (Real-time) │ REST API
▼ ▼
┌──────────────────────────┐ ┌─────────────────────────┐
│ Go Telephony Server │ │ Next.js Frontend │
│ (Twilio Integration) │ │ (Guardian Dashboard) │
│ │ │ │
│ • Call Management │ │ 🗺️ Live Location Map │
│ • Audio Transcoding │ │ 📊 Real-time Telemetry│
│ • Fall Detection (DSP) │ │ 📞 Call Logs │
│ • Behavior Prediction │ │ 👤 Beneficiary Mgmt │
└──────────────────────────┘ └─────────────────────────┘
☎️ Twilio API 💻 Web Dashboard
│
▼
📞 Clinic Calls
┌─────────────────────────────────────────────────────────────────────────┐
│ Beneficiary Health Consultation │
└─────────────────────────────────────────────────────────────────────────┘
1️⃣ Beneficiary speaks
│ "I've been having chest pain and shortness of breath..."
▼
2️⃣ Audio Stream (PCM 16kHz)
│ Android App (or Python prototype client) ──WebSocket──▶ Flask Backend
▼
3️⃣ Google Gemini 2.5 Flash (Live API)
│ • Real-time speech recognition
│ • Natural conversation
│ • Symptom extraction
▼
4️⃣ DeepAgent Health Triage (Azure GPT-4o)
│ Input: Transcript + Medical Context
│ Output:
│ • Risk Score (200-600)
│ • Alert Code (OK, Monitor, Book Routine, Emergency)
│ • Appointment Request Detection
│ • Recommended Actions
▼
5️⃣ Decision Tree
├─ Risk < 400 → Log conversation, monitor
│
└─ Risk ≥ 400 → Clinic Search + Automated Calling
│
▼
6️⃣ Tavily API / Overpass API
│ Search nearby walk-in clinics (geocoded location)
▼
7️⃣ Go Server + Twilio + ElevenLabs
│ • Initiate outbound call to clinic
│ • AI agent (ElevenLabs Conversational AI) speaks
│ • μ-law audio transcoding (8kHz)
│ • Schedule appointment on behalf of beneficiary
▼
8️⃣ Logging & Notification
│ • Save to MongoDB call_logs collection
│ • Real-time Socket.IO emit to Guardian Dashboard
│ • SMS notification via Twilio (optional)
▼
9️⃣ Guardian Receives Alert
│ Dashboard shows:
│ - New call log entry
│ - Risk assessment details
│ - Clinic appointment info
│ - Real-time beneficiary location on map
Location: /backend
| Technology | Purpose |
|---|---|
| Flask | Web framework with REST API |
| Flask-SocketIO | Real-time WebSocket communication |
| Google Gemini 2.5 Flash | Live voice conversation AI |
| Azure GPT-4o | Health triage and symptom analysis |
| Groq Llama-3.3-70b | Alternative LLM for clinic search |
| DeepAgents | Structured AI agent framework |
| LangChain | LLM orchestration |
| ElevenLabs Conversational AI | Voice synthesis for clinic calls |
| Twilio SDK | SMS and telephony integration |
| PyMongo | MongoDB database driver |
| Flask-JWT-Extended | JWT authentication |
| bcrypt | Password hashing |
| Tavily API | Web search for clinic locations |
| Geocoder | IP-based geolocation |
| Google Speech Recognition | Audio transcription |
Key Files:
- app.py - Main Flask application with SocketIO, JWT config, CORS
- agent.py - DeepAgent health triage with risk scoring
- gemini_live_api_user_end.py - Gemini Live API voice integration
- connect_eleven_agent.py - ElevenLabs WebSocket bridge
- auth.py - Authentication routes and JWT handling
- db.py - MongoDB connection and helpers
Location: /frontend
| Technology | Purpose |
|---|---|
| Next.js 16 | React framework with App Router |
| React 19 | UI library |
| TypeScript | Type-safe JavaScript |
| Tailwind CSS 4 | Utility-first styling |
| Radix UI | Accessible component primitives |
| Socket.IO Client | Real-time WebSocket client |
| Leaflet + React-Leaflet | Interactive maps (OpenStreetMap) |
| Vitest | Unit testing framework |
| React Testing Library | Component testing |
Key Features:
- Guardian dashboard with real-time telemetry display
- Beneficiary management (CRUD operations)
- Live location tracking with interactive maps
- Call log history with clinic interactions
- Medical context editor for each beneficiary
- Dual authentication system (Guardian/Beneficiary roles)
- Protected routes with JWT-based auth
Key Files:
- app/dashboard/[beneficiaryId]/page.tsx - Guardian dashboard
- components/auth-guard.tsx - Route protection
- lib/api.ts - API client with auth helpers
- lib/socket.ts - Socket.IO singleton
Location: /server
| Technology | Purpose |
|---|---|
| Go 1.25.5 | High-performance server language |
| Gorilla WebSocket | Bidirectional WebSocket streaming |
| Twilio SDK | Telephony integration (voice calls) |
| MongoDB Driver | Database persistence |
| go-dsp | Digital signal processing (FFT) |
| godotenv | Environment configuration |
Capabilities:
- Twilio Call Management: Initiate outbound calls, stream audio (μ-law 8kHz)
- Fall Detection: Analyze accelerometer + gyroscope for fall patterns
- Behavior Prediction: FFT-based running detection, posture analysis
- Audio Transcoding: Convert between PCM, μ-law, base64 formats
- Real-time Sensor Processing: Process IMU data at 50Hz
Key Files:
- main.go - HTTP server with WebSocket endpoints
- pkg/callSystem.go - Twilio integration
- pkg/AbnormanlityDetector.go - Fall/posture detection
- pkg/PredictUserbehavior.go - Activity classification
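The Go server's audio transcoding converts Twilio's 8 kHz μ-law stream to and from linear PCM. As a conceptual sketch (in Python, using the continuous μ-law formula rather than the segmented piecewise-linear G.711 codec that production telephony stacks use):

```python
import math

MU = 255  # mu-law compression parameter (G.711 uses mu = 255)

def pcm16_to_mulaw_approx(sample: int) -> int:
    """Compand a signed 16-bit PCM sample to an 8-bit value (0-255).

    Continuous mu-law formula; real G.711 codecs approximate this
    curve with linear segments, so values differ slightly.
    """
    x = max(-1.0, min(1.0, sample / 32768.0))
    y = math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)
    return round((y + 1.0) / 2.0 * 255)

def mulaw_approx_to_pcm16(code: int) -> int:
    """Expand an 8-bit companded value back to signed 16-bit PCM."""
    y = code / 255.0 * 2.0 - 1.0
    x = math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)
    return round(x * 32768.0)
```

Companding trades resolution for dynamic range: quiet samples keep fine granularity while loud samples are coarsely quantized, which is why a round trip reproduces the input only approximately.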
Location: /sensor/android-bridge
| Technology | Purpose |
|---|---|
| Kotlin | Android development language |
| Jetpack Compose | Modern declarative UI |
| Movesense BLE SDK | Wearable device integration |
| Socket.IO Android | Real-time telemetry streaming |
| Android SensorManager | Phone sensor access (fallback) |
Features:
- Movesense Wearable Integration:
  - Heart rate monitoring (BLE service UUID 180D)
  - 6-axis IMU (accelerometer + gyroscope)
  - 50Hz sampling rate
- Phone Sensor Fallback: Uses device accelerometer/gyroscope when wearable unavailable
- Real-time Telemetry: Streams HR + IMU data via Socket.IO (10Hz)
- Voice Interface: Microphone access for voice consultations
- Sensor Fusion: Intelligently merges Movesense and phone sensor data
Data Flow:
Movesense Wearable (BLE, coming soon) ──┐
├──▶ TelemetryPipeline ──▶ SocketBridge ──▶ Backend
Phone Sensors (Fallback, coming soon) ──┘
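The wearable-first fallback policy in the pipeline above can be sketched as follows (the `ImuSample` structure and `select_telemetry` helper are hypothetical names for illustration; the actual TelemetryPipeline merges streams in Kotlin):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImuSample:
    source: str                        # "movesense" or "phone"
    accel: tuple                       # (x, y, z) in m/s^2
    gyro: tuple                        # (x, y, z) in deg/s
    heart_rate: Optional[int] = None   # only the wearable reports HR

def select_telemetry(wearable: Optional[ImuSample],
                     phone: Optional[ImuSample]) -> Optional[ImuSample]:
    """Prefer the Movesense wearable; fall back to phone sensors.

    Sketch of the fallback policy described above -- the phone stream
    carries no heart rate, so HR simply drops out when the wearable
    is unavailable.
    """
    if wearable is not None:
        return wearable
    return phone
```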
Database Name: MedRedemption

users collection:
{
  _id: ObjectId,          // Unique identifier
  name: String,           // Guardian's full name
  email: String,          // Email (unique, used for login)
  password: String,       // bcrypt/scrypt hashed password
  created_at: DateTime    // Account creation timestamp
}

beneficiaries collection:
{
  _id: ObjectId,          // Unique identifier
  owner_email: String,    // Reference to guardian (users.email)
  username: String,       // Beneficiary's name
  email: String,          // Beneficiary email (unique)
  context: String,        // Medical history/context (free text)
  password_hash: String,  // Hashed password for beneficiary login
  created_at: DateTime    // Profile creation timestamp
}
Relationships: owner_email → users.email (many-to-one)

call_logs collection:
{
  _id: ObjectId,              // Unique identifier
  beneficiaryId: String,      // Reference to beneficiaries._id
  guardianId: String,         // Reference to users._id
  timestamp: DateTime,        // Call initiation time
  description: String,        // Call summary/transcript
  duration: Number,           // Call duration in seconds
  clinic_name: String,        // (Optional) Clinic contacted
  appointment_status: String  // (Optional) "scheduled", "pending", etc.
}
Relationships: beneficiaryId → beneficiaries._id; guardianId → users._id

sensor_data collection:
{
  occurancetime: DateTime,  // Timestamp of sensor reading
  useremail: String,        // Reference to beneficiaries.email
  girodata: {               // Gyroscope data (degrees/second)
    x: Number,
    y: Number,
    z: Number
  },
  accelerationdata: Number  // Acceleration magnitude (m/s²)
}
Purpose: Raw sensor data for abnormality detection analysis

Processed sensor insights:
{
  occurancetime: DateTime,  // Timestamp of analysis
  isrunning: Boolean,       // Running activity detected
  posturestatus: String,    // "standing", "sitting", "lying"
  isfalldetected: Boolean,  // Fall event detected
  useremail: String         // Reference to beneficiaries.email
}
Purpose: Processed sensor insights and alerts
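Building a raw telemetry document against the sensor_data schema might look like this (field names come from the schema above, including its `occurancetime`/`girodata` spellings; `make_sensor_doc` is a hypothetical helper, and the schema stores a single acceleration magnitude rather than three axes):

```python
import math
from datetime import datetime, timezone

def make_sensor_doc(useremail: str,
                    gyro_xyz: tuple,
                    accel_xyz: tuple) -> dict:
    """Build a document matching the sensor_data schema.

    The 3-axis accelerometer reading is collapsed to a magnitude
    |a| = sqrt(x^2 + y^2 + z^2), since the schema stores one Number.
    """
    gx, gy, gz = gyro_xyz
    return {
        "occurancetime": datetime.now(timezone.utc),
        "useremail": useremail,
        "girodata": {"x": gx, "y": gy, "z": gz},
        "accelerationdata": math.sqrt(sum(a * a for a in accel_xyz)),
    }

# With PyMongo this would be persisted as:
#   db.sensor_data.insert_one(make_sensor_doc(...))
```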
- Python: 3.11 or higher
- Node.js: 18 or higher
- Go: 1.25 or higher
- Android Studio: Latest version (for Android app)
- MongoDB Atlas Account: For database hosting
- API Keys Required:
- Google Gemini API
- Azure OpenAI (GPT-4o)
- Groq API
- ElevenLabs API
- Twilio Account (SID + Auth Token)
- Tavily API
Backend (.env):
# MongoDB
MONGO_DB_URI=mongodb+srv://<username>:<password>@cluster.mongodb.net/MedRedemption?retryWrites=true&w=majority
# AI APIs
GEMINI_API_KEY=your_gemini_api_key_here
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_API_KEY=your_azure_api_key_here
AZURE_OPENAI_DEPLOYMENT=gpt-4o
GROQ_API_KEY=your_groq_api_key_here
# ElevenLabs
ELEVEN_API_KEY=your_elevenlabs_api_key_here
ELEVEN_AGENT_ID=your_agent_id_here
# Twilio
TWILIO_ACCOUNT_SID=your_twilio_account_sid
TWILIO_ACCOUNT_PASS=your_twilio_auth_token
TWILIO_PHONE_NUMBER=+1234567890
# Search APIs
TAVILY_API_KEY=your_tavily_api_key_here
# Security
JWT_SECRET_KEY=your_random_secret_key_here  # generate with secrets.token_hex(32)
# Server Config
FLASK_ENV=development
FLASK_DEBUG=True

Frontend (.env.local):
# Backend API
NEXT_PUBLIC_API_URL=http://localhost:5000
NEXT_PUBLIC_SOCKET_URL=http://localhost:5000
# Optional: Analytics, maps, etc.
# NEXT_PUBLIC_MAPBOX_TOKEN=your_mapbox_token

Go Server (.env):
# MongoDB
MONGO_DB_URI=mongodb+srv://<username>:<password>@cluster.mongodb.net/MedRedemption?retryWrites=true&w=majority
# Twilio
TWILIO_ACCOUNT_SID=your_twilio_account_sid
TWILIO_ACCOUNT_PASS=your_twilio_auth_token
# Server Config
SERVER_URL=http://your-ngrok-url.ngrok.io # For Twilio webhooks (use ngrok in dev)
PORT=8080

Android App:
# Socket.IO
SOCKET_URL=http://10.0.2.2:5000 # Use 10.0.2.2 for Android emulator (localhost proxy)
# For physical device, use your computer's IP: http://192.168.x.x:5000
# Movesense
MOVESENSE_DEVICE_NAME=Movesense  # Default device name to connect to

cd backend
# Install dependencies (choose one)
pip install -r requirements.txt
# OR with uv (faster)
uv sync
# Run development server
python app.py
# Server starts on http://localhost:5000

Ports:
- Flask API: 5000
- SocketIO: 5000 (same server)
cd frontend
# Install dependencies
npm install
# OR
yarn install
# Run development server
npm run dev
# Server starts on http://localhost:3000

Access: Open browser to http://localhost:3000
cd server
# Download dependencies
go mod download
# Run server
go run main.go
# Server starts on http://localhost:8080

For Twilio webhooks (development):
# Install ngrok
brew install ngrok # or download from ngrok.com
# Start tunnel
ngrok http 8080
# Copy https URL to SERVER_URL in .env
# Example: https://abc123.ngrok.io

cd sensor/android-bridge
# Build project
./gradlew build
# Or open in Android Studio:
# File → Open → Select android-bridge directory
# Run on device/emulator

Requirements:
- Movesense wearable device (optional; falls back to phone sensors)
- Android 8.0+ (API level 26+)
- Bluetooth permissions for BLE
- Location permissions for sensor access
- Natural conversation about symptoms via Google Gemini 2.5 Flash
- Real-time speech recognition and response generation
- Conversational memory across session
- Automatic clinic appointment booking when symptoms require medical attention
- Heart Rate Tracking: Continuous HR monitoring via Movesense BLE sensor
- Fall Detection: Accelerometer + gyroscope pattern analysis (DSP heuristics)
- Activity Recognition: Running, walking, posture transitions
- Phone Sensor Fallback: Seamless switch to phone IMU when wearable unavailable
- Live Location Map: See beneficiary's current location (Leaflet + OpenStreetMap)
- Telemetry Visualization: Real-time heart rate and IMU sensor data
- Call History: Complete log of AI-assisted clinic consultations
- Medical Context Editor: Maintain up-to-date medical history for each beneficiary
- Real-time Updates: Instant notifications via Socket.IO when events occur
- Link and monitor multiple beneficiaries under one guardian account
- Separate profiles with individual medical contexts
- Quick switch between beneficiary dashboards
- Individual call logs and telemetry per beneficiary
- Symptom Analysis: DeepAgent processes conversation transcript
- Risk Scoring: Assigns risk score (200-600 scale)
- 200-299: Normal (OK)
- 300-399: Monitor
- 400-499: Book Routine Appointment
- 500-599: Urgent Care Needed
- 600+: Emergency (Call 911)
- Alert Codes: Categorizes severity (OK, Monitor, Book Routine, Emergency)
- Appointment Detection: Identifies when beneficiary requests medical help
- Location-based Search: Finds nearby walk-in clinics using Tavily/Overpass API
- AI Voice Agent: ElevenLabs Conversational AI initiates call via Twilio
- Appointment Scheduling: Agent speaks with clinic receptionist to book appointment
- Guardian Notification: Real-time alert sent to dashboard with call details
- Call Logging: Full transcript and metadata saved to MongoDB
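The risk bands and the ≥400 booking threshold from the decision tree can be sketched as a simple mapping (a sketch of the documented bands, not the DeepAgent prompt logic itself):

```python
def alert_code(risk_score: int) -> str:
    """Map a DeepAgent risk score (200-600 scale) to an alert code,
    following the bands documented above."""
    if risk_score < 300:
        return "OK"
    if risk_score < 400:
        return "Monitor"
    if risk_score < 500:
        return "Book Routine Appointment"
    if risk_score < 600:
        return "Urgent Care Needed"
    return "Emergency"

def should_call_clinic(risk_score: int) -> bool:
    """The decision tree triggers clinic search + calling at risk >= 400."""
    return risk_score >= 400
```

For example, the walkthrough below assigns Alice a score of 520, which lands in the "Urgent Care Needed" band and triggers the automated clinic call.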
1. Free-fall Detection: Acceleration < 4 m/s² for >100ms
2. Impact Detection: Acceleration > 25 m/s² spike
3. Rotation Analysis: Gyroscope magnitude > 200°/s
4. Post-fall Stillness: Low movement after impact
→ Alert triggered if all conditions met
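The four-stage heuristic above can be sketched over a window of 50 Hz samples (thresholds taken from the steps above; this is an illustration of the idea, not the Go server's DSP implementation):

```python
def detect_fall(samples, fs=50):
    """Heuristic fall check over a window of (accel_mag, gyro_mag) pairs.

    Mirrors the documented stages: free fall (<4 m/s^2 for >100 ms),
    an impact spike (>25 m/s^2) after the free fall, fast rotation
    (>200 deg/s), then post-impact stillness near 1 g.
    """
    min_freefall = int(0.1 * fs)  # >100 ms of consecutive free fall
    freefall_run = 0
    freefall_seen = impact_idx = None
    rotation_seen = False

    for i, (a, g) in enumerate(samples):
        if a < 4.0:
            freefall_run += 1
            if freefall_run >= min_freefall:
                freefall_seen = i
        else:
            freefall_run = 0
        if g > 200.0:
            rotation_seen = True
        if a > 25.0 and freefall_seen is not None and impact_idx is None:
            impact_idx = i  # impact must follow the free-fall phase

    if impact_idx is None or not rotation_seen:
        return False
    # post-fall stillness: acceleration stays close to gravity after impact
    tail = [a for a, _ in samples[impact_idx + 1:]]
    return bool(tail) and all(abs(a - 9.81) < 2.0 for a in tail)
```

Requiring all four stages in order is what keeps ordinary bumps (impact without free fall) or sitting down quickly (no rotation spike) from triggering alerts.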
- Standing/Sitting Detection: Gyroscope Y-axis thresholds
- Transition Alerts: Sudden posture changes logged
- FFT-based Running Detection: Frequency domain analysis of gait patterns
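FFT-based running detection looks for a dominant gait frequency in the acceleration magnitude. A minimal sketch (the 1.5-4 Hz band and amplitude threshold here are illustrative, not the Go server's tuned values):

```python
import numpy as np

def detect_running(accel_mag, fs=50, band=(1.5, 4.0), threshold=3.0):
    """Return True if a strong periodic component falls in the gait band.

    Running cadence typically lands around 2.5-3 Hz; walking and
    standing leave little energy above ~1.5 Hz.
    """
    x = np.asarray(accel_mag, dtype=float)
    x = x - x.mean()                           # remove gravity / DC offset
    spectrum = np.abs(np.fft.rfft(x)) / len(x)  # single-sided amplitude
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return bool(in_band.any() and spectrum[in_band].max() > threshold)
```

At a 50 Hz sampling rate, a 4-second window (200 samples) gives 0.25 Hz frequency resolution, comfortably enough to separate a ~1 Hz walking cadence from a ~2.5 Hz run.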
Scenario: Elderly beneficiary (Alice) lives alone, monitored by her daughter (Guardian).
- Morning Routine: Alice's Movesense wearable streams HR (70 bpm) + IMU data to backend
- Guardian Check: Daughter logs into dashboard, sees Alice's location on map (home)
- Symptom Onset: Alice feels chest pain, opens Android app, speaks to AI:
- "I've been having sharp chest pain for 20 minutes and feel short of breath"
- AI Analysis:
- Gemini transcribes and responds with empathetic questions
- DeepAgent assigns Risk Score: 520 (Urgent Care)
- Alert Code: "Seek Immediate Medical Attention"
- Automated Response:
- Backend searches clinics within 5km radius
- Finds "City Walk-In Clinic" (2.3km away)
- Go server + ElevenLabs agent calls clinic
- AI books appointment for Alice in 1 hour
- Guardian Alert:
- Dashboard shows new call log entry
- SMS sent: "Alice needs urgent care. Appointment booked at City Walk-In Clinic at 10:30 AM"
- Live map updates with clinic location marker
- Follow-up: Guardian calls Alice, arranges transportation to clinic
| Method | Endpoint | Purpose |
|---|---|---|
| POST | /api/auth/register | Create new guardian account |
| POST | /api/auth/login | Authenticate guardian/beneficiary |
| POST | /api/auth/logout | Invalidate session |
| GET | /api/auth/verify | Verify JWT token |
| GET | /api/beneficiaries | List beneficiaries for guardian |
| POST | /api/beneficiaries | Create new beneficiary profile |
| PUT | /api/beneficiaries/:id | Update beneficiary info/context |
| DELETE | /api/beneficiaries/:id | Remove beneficiary |
| GET | /api/call-logs/:beneficiaryId | Get call history |
SocketIO Events:
- telemetry - Real-time sensor data broadcast
- audio_chunk - Voice conversation audio streaming
- call_log_update - New clinic call notification
- location_update - Beneficiary GPS update
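A client consumes these by registering a handler per event name (with python-socketio this is `sio.on("telemetry", handler)`). A minimal dispatch sketch, with hypothetical payload fields for illustration:

```python
# Event-name -> handler registry mirroring the Socket.IO events above.
handlers = {}

def on(event):
    """Register a handler for a named event (decorator, like sio.on)."""
    def register(fn):
        handlers[event] = fn
        return fn
    return register

def dispatch(event, payload):
    """Route an incoming event to its handler; ignore unknown events."""
    handler = handlers.get(event)
    return handler(payload) if handler else None

@on("telemetry")
def handle_telemetry(data):
    hr = data.get("heart_rate")  # hypothetical payload field
    return f"HR {hr} bpm" if hr is not None else "no HR"

@on("call_log_update")
def handle_call_log(data):
    return f"New call log for {data.get('beneficiaryId')}"
```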
| Method | Endpoint | Purpose |
|---|---|---|
| GET | /newconnection | Get session ID for Twilio call |
| GET | /ws/twilio | Twilio audio stream WebSocket |
| GET | /ws/agent | ElevenLabs agent WebSocket |
| POST | /sendsms | Send SMS notification |
| POST | /abnormalitydetector | Process sensor data for falls |
Built for: ConuHacks X 2026
Category: Health Tech / AI + IoT
Timeline: 48-hour hackathon sprint
Disclaimer: This is a prototype/proof-of-concept built for educational purposes. Not intended for production medical use without proper clinical validation, regulatory approval, and professional medical oversight.
- Password Hashing: bcrypt for guardian accounts, scrypt for beneficiary accounts
- JWT Authentication: Secure token-based auth with expiration
- CORS Configuration: Restricted origins for API access
- Environment Variables: Sensitive keys stored in .env files (never committed)
- HTTPS Required: Production deployment must use SSL/TLS
- Database Security: MongoDB Atlas with IP whitelisting and authentication
This project was created for ConuHacks X 2026. All rights reserved by the development team.
Built with ❤️ by the MedRedemption team at ConuHacks X 2026.
- Google Gemini for Live API access
- Azure OpenAI for GPT-4o integration
- ElevenLabs for conversational AI technology
- Twilio for telephony infrastructure
- MongoDB for Atlas database hosting
- Movesense for wearable sensor hardware
- ConuHacks X organizers and sponsors
🚀 MedRedemption - Proactive Healthcare Through AI
Monitoring today, preventing tomorrow