Inspiration
Lucid was built around a simple idea: your body already knows when you've stopped learning. By turning your phone camera into a fully functioning biometric sensor, Lucid reads signals like heart rate, breathing, and blink patterns in real time to understand when you're in flow, frustrated, fatigued, or mentally checked out.
Instead of forcing productivity through fixed study tools, Lucid adapts uniquely to you, turning physiological feedback from your device into the ultimate context-aware productivity tool.
What it does
Lucid is a biometric-adaptive productivity platform that monitors focus in real time and adjusts your learning experience to match:
- Reads heart rate, HRV, breathing rate, and blink patterns from a phone camera using the Presage SmartSpectra SDK
- Streams biometric data over WiFi from phone → laptop during a study session
- Builds a personal baseline during calibration and classifies cognitive states such as flow, frustration, fatigue, or distraction
- Adjusts the study experience with interventions like simplified explanations, flashcards, and break prompts
- Provides a biofeedback breathing reset using the user's live breathing waveform
- Generates a session report highlighting focus trends and stress points
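As a concrete sketch, one streamed biometric sample might look like the TypeScript shape below. The field names and plausibility bounds are our illustration of the pipeline, not the SmartSpectra SDK's actual schema:

```typescript
// Shape of one biometric sample streamed from the phone.
// Field names are illustrative, not the SDK's real schema.
interface BiometricSample {
  timestamp: number;      // ms since epoch
  heartRate: number;      // beats per minute
  hrv: number;            // heart-rate variability, ms
  breathingRate: number;  // breaths per minute
  blinkRate: number;      // blinks per minute
}

// Basic sanity gate before a sample enters the pipeline;
// the bounds here are rough placeholders.
function isPlausible(s: BiometricSample): boolean {
  return (
    s.heartRate > 30 && s.heartRate < 220 &&
    s.breathingRate > 4 && s.breathingRate < 40 &&
    s.blinkRate >= 0
  );
}
```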
How we built it
Lucid is built from three connected layers.
Biometric capture
An Android phone uses the Presage SmartSpectra SDK to extract physiological signals directly from your camera feed. These metrics are streamed as JSON over a WebSocket connection or directly through USB.

Backend processing
A Node.js + Express server receives the biometric stream, logs session data in SQLite, and exposes endpoints for adaptive content and session summaries. Local inference runs through Ollama using Qwen 2.5 to generate explanations and flashcards without external APIs.

Adaptive study interface
A React frontend visualizes biometric signals in real time (graphs, live metrics, etc.). A cognitive engine compares incoming metrics against the user's baseline and maps them to mental states, triggering study interventions and breathing resets when needed.
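The baseline comparison in the cognitive engine can be sketched as z-scores against a per-user calibration baseline, mapped to coarse states. The thresholds and rules below are placeholders for illustration, not Lucid's tuned values:

```typescript
// Per-metric calibration baseline captured during the calibration phase.
interface Baseline { mean: number; std: number; }
type Metrics = { heartRate: number; hrv: number; breathingRate: number; blinkRate: number };
type State = "flow" | "frustration" | "fatigue" | "distraction";

// How far a reading deviates from this user's baseline, in std devs.
function z(value: number, b: Baseline): number {
  return (value - b.mean) / (b.std || 1);
}

// Illustrative rules: elevated arousal -> frustration, heavy blinking with
// a slow heart rate -> fatigue, everything near baseline -> flow.
function classify(m: Metrics, base: Record<keyof Metrics, Baseline>): State {
  const hr = z(m.heartRate, base.heartRate);
  const hrv = z(m.hrv, base.hrv);
  const br = z(m.breathingRate, base.breathingRate);
  const blink = z(m.blinkRate, base.blinkRate);

  if (hr > 1 && hrv < -1 && br > 1) return "frustration";
  if (blink > 1.5 && hr < 0) return "fatigue";
  if ([hr, hrv, br, blink].every((v) => Math.abs(v) < 1)) return "flow";
  return "distraction";
}
```

In practice a real engine would weight and smooth these signals over a window rather than classify single samples, but the baseline-relative comparison is the core idea.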
Challenges we ran into
- Streaming biometric data over WiFi introduced latency spikes that threw off state detection; we had to build a smoothing buffer to keep things stable
- Raw biometrics are noisy. Heart rate alone doesn't tell you much. Getting accurate states meant combining and weighting multiple signals together
- Running Qwen 2.5 locally while processing a live biometric stream was heavy; we had to be deliberate about when to call inference so it wouldn't block the pipeline
- Making the breathing biofeedback feel smooth and calming instead of laggy took more iteration than expected
Accomplishments that we're proud of
- Built a complete end-to-end biometric pipeline from phone camera to adaptive study interface
- Implemented a real-time cognitive state engine that reacts immediately to physiological changes
- Designed interventions that adapt the learning flow instead of interrupting it
- Ran the entire system locally, including AI-generated study explanations and summaries
What we learned
- Biometric signals are personal. A baseline heart rate means something totally different for two different people. Calibration isn't optional
- Less intervention is better. Early versions interrupted too much. The best version of Lucid is invisible when you're doing well
- Your body communicates way more than your conscious mind picks up on. Building this made that impossible to ignore
What's next for Lucid
- Train a machine learning model on biometric session data to improve cognitive state detection
- Expand biometric capture to computer webcams and iOS
- Add new adaptive interventions like dynamic pacing, difficulty adjustment, and quiz generation
- Introduce deeper analytics so students can understand long-term focus patterns and know exactly when they are at peak productivity
By Divyam and Ayush
Built With
- android
- express.js
- javascript
- kotlin
- node.js
- ollama
- presage-smartspectra-sdk
- qwen-2.5
- react
- sqlite
- typescript
- usb
- websocket

