Inspiration

I wanted to democratize clinical-grade health metrics. Heart Rate Variability (HRV) is a powerful indicator of stress and recovery, but it’s usually locked behind expensive wearables like Apple Watch or Oura. I asked myself: "What if a smartphone camera was enough?" Inspired by the principles of Photoplethysmography (PPG), I set out to build Pulse Orb—a "Bio-Metric Triage" tool that turns any phone into a medical scanner, no extra hardware required. I wanted a UI that felt like a futuristic medical bay, centered around a living, breathing "Orb" interface, rather than just another sterile chart.

What it does

Pulse Orb is a web-based biosensor that measures heart health in real time.

Optical Scanning: It uses the smartphone's rear camera and flash to detect blood volume changes in the fingertip (PPG).

Signal Processing: It calculates BPM (Heart Rate) and HRV (Stress/Recovery) using raw signal mathematics directly in the browser.
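
For context, here is a minimal sketch of the kind of math this involves, assuming pulse-peak timestamps have already been extracted from the PPG waveform. The function shape is illustrative, but the 60,000 / mean-interval heart rate and the RMSSD formula are the standard definitions:

```ts
// Sketch: deriving BPM and HRV (RMSSD) from detected pulse-peak timestamps.
// Assumes `peakTimesMs` holds the timestamps (in ms) of successive PPG peaks.
export function computeMetrics(peakTimesMs: number[]): { bpm: number; rmssd: number } {
  // Inter-beat intervals (IBIs) between consecutive peaks.
  const ibis: number[] = [];
  for (let i = 1; i < peakTimesMs.length; i++) {
    ibis.push(peakTimesMs[i] - peakTimesMs[i - 1]);
  }
  if (ibis.length < 2) throw new Error("Not enough peaks for a reading");

  // Heart rate: 60,000 ms per minute divided by the mean interval.
  const meanIbi = ibis.reduce((a, b) => a + b, 0) / ibis.length;
  const bpm = 60000 / meanIbi;

  // HRV via RMSSD: root mean square of successive IBI differences.
  let sumSq = 0;
  for (let i = 1; i < ibis.length; i++) {
    const diff = ibis[i] - ibis[i - 1];
    sumSq += diff * diff;
  }
  const rmssd = Math.sqrt(sumSq / (ibis.length - 1));

  return { bpm: Math.round(bpm), rmssd };
}
```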

AI Analysis: It sends the computed metrics to Dr. Gemini for an instant, context-aware health assessment (e.g., "You seem stressed; try hydrating.").

Haptic Playback: It lets users "feel" their own heartbeat pattern through the phone's vibration motor.

Secure Vault: It encrypts and stores medical history in the cloud (Firebase) so users can track trends over time.

How I built it

I built Pulse Orb as a Progressive Web App (PWA) to ensure zero-friction access (no app store download needed).

Core Engine: I wrote a custom SignalProcessor in TypeScript that analyzes video frames at 60fps via an HTML5 Canvas, detecting subtle redness shifts invisible to the human eye.
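
A rough sketch of what that per-frame sampling can look like; the function name and the 64×64 downscale are my illustrative choices, not the actual SignalProcessor internals:

```ts
// Sketch: sampling the mean red-channel intensity of the current video frame.
function sampleRedChannel(video: HTMLVideoElement, canvas: HTMLCanvasElement): number {
  const ctx = canvas.getContext("2d", { willReadFrequently: true });
  if (!ctx) throw new Error("2D context unavailable");

  // Downscale so getImageData stays cheap at 60 fps.
  canvas.width = 64;
  canvas.height = 64;
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);

  const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
  let sum = 0;
  for (let i = 0; i < data.length; i += 4) {
    sum += data[i]; // red channel of each RGBA pixel
  }
  return sum / (data.length / 4); // 0–255 average; this is the raw PPG sample
}
```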

Frontend: Built with React (Vite) and styled with my custom SASS "Bio-Glass" system (Glassmorphism + Neon).

Backend: Firebase handles Authentication (Anonymous & Email), and Firestore (NoSQL) serves as the data vault.
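
A minimal sketch of the vault's write path, assuming the Firebase v9 modular SDK; the collection and field names are illustrative:

```ts
// Sketch: anonymous sign-in followed by writing one reading to Firestore.
import { initializeApp } from "firebase/app";
import { getAuth, signInAnonymously } from "firebase/auth";
import { getFirestore, collection, addDoc, serverTimestamp } from "firebase/firestore";

const app = initializeApp({ /* your Firebase config */ });
const auth = getAuth(app);
const db = getFirestore(app);

export async function saveReading(bpm: number, hrv: number): Promise<void> {
  // Anonymous sign-in keeps onboarding friction at zero.
  const { user } = await signInAnonymously(auth);

  await addDoc(collection(db, "readings"), {
    uid: user.uid,
    bpm,
    hrv,
    createdAt: serverTimestamp(),
  });
}
```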

AI: I integrated Google Gemini 1.5 via API to generate the medical reports.
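
A hedged sketch of what that report call can look like, assuming the official @google/generative-ai SDK; the model id, environment variable, and prompt wording are illustrative placeholders:

```ts
// Sketch: asking Gemini for a short wellness report from the measured metrics.
import { GoogleGenerativeAI } from "@google/generative-ai";

const genAI = new GoogleGenerativeAI(import.meta.env.VITE_GEMINI_API_KEY as string);
const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });

export async function getHealthReport(bpm: number, hrv: number): Promise<string> {
  const prompt =
    `You are a cautious wellness assistant (not a doctor). ` +
    `Given a resting heart rate of ${bpm} BPM and an RMSSD of ${hrv} ms, ` +
    `give a short, plain-language assessment and one recovery tip.`;

  const result = await model.generateContent(prompt);
  return result.response.text();
}
```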

Hardware Control: I used the MediaStreamTrack API to take low-level control of the device torch (flash) and focus modes.
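
For example, toggling the torch looks roughly like this; since torch is still a non-standard constraint that TypeScript's DOM types don't know about, it has to be feature-checked and cast:

```ts
// Sketch: turning the device torch on or off via the active video track.
async function setTorch(stream: MediaStream, on: boolean): Promise<boolean> {
  const track = stream.getVideoTracks()[0];
  const caps = track.getCapabilities() as MediaTrackCapabilities & { torch?: boolean };

  if (!caps.torch) return false; // this device (or browser) can't do it

  // `torch` isn't in the standard constraint types yet, hence the cast.
  await track.applyConstraints({ advanced: [{ torch: on } as any] });
  return true;
}
```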

Challenges I ran into

The "Hardware Race Condition" was my biggest nightmare.

The "Blink and Die" Bug: On many Android devices, programmatically turning on the flash would kill the video stream instantly. I had to engineer a "Gentle Start" sequence with specific timing delays (200 ms) and a "Death Monitor" event listener to reboot the stream if the OS revoked permissions.
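
A simplified sketch of that sequence; the function names and restart handling are illustrative, and the 200 ms delay mirrors the value described above:

```ts
// Sketch: "Gentle Start" — open the camera first, wait, then enable the torch,
// and watch for the OS killing the track so the caller can reboot the scan.
const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function startScan(onStreamDied: () => void): Promise<MediaStream> {
  // 1. Ask for the rear camera without the torch first.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "environment" },
  });
  const track = stream.getVideoTracks()[0];

  // 2. Give the camera pipeline a moment to settle before touching the torch.
  await delay(200);
  try {
    await track.applyConstraints({ advanced: [{ torch: true } as any] });
  } catch {
    // Some devices refuse the torch; continue with ambient light.
  }

  // 3. "Death Monitor": if the OS revokes the camera, let the caller restart.
  track.addEventListener("ended", onStreamDied);

  return stream;
}
```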

Signal Noise: Distinguishing a heartbeat from a shaky hand was difficult. I implemented (and then refined) a "Gatekeeper" algorithm to detect air/light leaks before recording data.
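
A sketch of the idea behind such a gate, with illustrative thresholds: a fingertip pressed over the lens produces a dim but strongly red frame, while open air or a bright room does not:

```ts
// Sketch: reject samples unless the frame looks like a finger on the lens.
function fingerIsOnLens(meanRed: number, meanGreen: number, meanBlue: number): boolean {
  const brightness = (meanRed + meanGreen + meanBlue) / 3;
  const redDominance = meanRed / Math.max(1, meanGreen + meanBlue);

  const notPointingAtAir = brightness < 200; // a lit room saturates the frame
  const notInTheDark = brightness > 20;      // pocket / table = no signal at all
  const looksLikeSkin = redDominance > 0.6;  // blood-filled tissue skews strongly red

  return notPointingAtAir && notInTheDark && looksLikeSkin;
}
```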

Desktop vs. Mobile: Since PCs don't have flashes, I built a custom "Device Guard" that blocks scanning on desktops but allows full access to the History Vault.
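
A sketch of that kind of guard, with illustrative heuristics (a touch-capable device plus a camera that actually reports a torch capability):

```ts
// Sketch: decide whether to offer the scanner at all on this device.
async function canScanOnThisDevice(): Promise<boolean> {
  // Coarse check: desktops rarely expose a torch, and no touch input is a strong hint.
  if (navigator.maxTouchPoints === 0) return false;

  // Firmer check: open the rear camera briefly and see if it reports a torch.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "environment" },
  });
  const caps = stream.getVideoTracks()[0].getCapabilities() as { torch?: boolean };
  stream.getTracks().forEach((t) => t.stop()); // don't hold the camera open

  return Boolean(caps.torch);
}
```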

Accomplishments that I'm proud of

The "Haptic Heartbeat": I successfully mapped the user's recorded BPM to the Web Vibration API. Holding the phone and feeling your own pulse playback is a surreal user experience.
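
A minimal sketch of how a BPM value can be turned into a vibration pattern with navigator.vibrate; the 80 ms pulse length is an illustrative choice:

```ts
// Sketch: play back a recorded heart rate as a haptic pattern.
function playHapticHeartbeat(bpm: number, beats = 20): boolean {
  if (!("vibrate" in navigator)) return false; // e.g., iOS Safari doesn't support it

  const beatIntervalMs = 60000 / bpm;      // time between beats
  const pulseMs = 80;                      // how long each "thump" vibrates
  const gapMs = Math.max(0, beatIntervalMs - pulseMs);

  // Pattern format: [vibrate, pause, vibrate, pause, ...]
  const pattern: number[] = [];
  for (let i = 0; i < beats; i++) {
    pattern.push(pulseMs, gapMs);
  }
  return navigator.vibrate(pattern);
}
```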

Clinical UI: I broke away from standard "Bootstrap" looks and created a custom "Void Black" & "Cyan Glitch" theme that feels premium and immersive.

Zero-Server Latency: By processing the video signal locally in the browser (using requestAnimationFrame), I achieved real-time graphing without sending heavy video files to a server.
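
A sketch of such a loop; the sample callback stands in for the per-frame red-channel read, and the returned stop function is how a React effect could clean it up:

```ts
// Sketch: a requestAnimationFrame loop that keeps all signal processing on-device.
function startProcessingLoop(
  sample: () => number, // wraps the canvas red-channel read for the current frame
  onSample: (value: number, timestampMs: number) => void
): () => void {
  let running = true;

  const tick = () => {
    if (!running) return;
    onSample(sample(), performance.now());
    requestAnimationFrame(tick); // typically ~60 fps, matching the display
  };
  requestAnimationFrame(tick);

  // Return a stopper so the scan screen can clean up on unmount.
  return () => {
    running = false;
  };
}
```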

What I learned

The Web is Powerful (but Fragile): Accessing hardware like the torch from a browser is experimental and varies wildly across Chrome, Safari, and Firefox. I had to code defensively.

User Trust is Visual: Users didn't trust the scanner until I added the "Initializing..." spinner and the live graph. Seeing the data makes it real.

Concurrency: Managing async hardware states (Camera Start -> Flash On -> Stream Play) requires strict session management to prevent "zombie processes" from crashing the UI.
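
One way to express that guard, sketched with an illustrative session counter: any async step that resolves after a newer session has started simply stops the stale stream and bails out instead of touching the UI:

```ts
// Sketch: session guard against "zombie" async camera states.
let currentSession = 0;

async function runScanSession(
  startCamera: () => Promise<MediaStream>,
  enableTorch: (s: MediaStream) => Promise<void>,
  onReady: (s: MediaStream) => void
): Promise<void> {
  const session = ++currentSession;

  const stream = await startCamera();
  if (session !== currentSession) {
    // User navigated away mid-await: release the camera, don't update the UI.
    stream.getTracks().forEach((t) => t.stop());
    return;
  }

  await enableTorch(stream);
  if (session !== currentSession) {
    stream.getTracks().forEach((t) => t.stop());
    return;
  }

  onReady(stream);
}
```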

What's next for Pulse Orb

Telemedicine Export: Allowing users to generate a PDF report (which I have started!) and email it directly to a doctor.

Long-Term Trends: Using Gemini to analyze weeks of data to predict burnout before it happens.

Native App: Porting the core logic to React Native for better access to camera ISO/Exposure controls, which would further improve accuracy.
