About LumenSync

🌅 The Inspiration

The idea for LumenSync came from a simple realization: we're all time travelers, but we can't sense time passing through our bodies.

After experiencing brutal jet lag on a trip across 8 time zones, I became fascinated by circadian rhythms—our body's internal ~24-hour clock that regulates sleep, alertness, hormone production, and even mood. The science is clear: humans have a dedicated internal timekeeper, the suprachiasmatic nucleus (SCN), a cluster of ~20,000 neurons in the hypothalamus that acts as our biological pacemaker.

But here's the paradox: despite this sophisticated internal sensor constantly firing, we have no conscious awareness of it. We experience its outputs (fatigue, hunger, alertness) but never the rhythm itself.

What if we could make this invisible sense visible?

That question became LumenSync—an app that transforms circadian biology from an abstract concept into a tangible, trackable, optimizable sense.

🧠 What I Learned

Circadian Phase Mathematics

The core of circadian tracking is understanding phase shifts. When your biological clock (period τ, typically ~24.2 hours) deviates from environmental time (24 hours), you experience a phase difference:

$$\Delta\phi = \phi_{biological} - \phi_{environmental}$$

Light exposure advances or delays this phase based on a Phase Response Curve (PRC):

$$\Delta\phi(t) = A \cdot \sin\left(\frac{2\pi(t - t_{max})}{24}\right)$$

Where:

- $A$ = amplitude of the phase shift (typically 1–3 hours)
- $t$ = time of light exposure
- $t_{max}$ = time of maximum sensitivity (~2–4 AM)

Implementing this taught me that biological time isn't linear—it's a dynamic oscillator constantly entraining to external zeitgebers ("time-givers").
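The PRC formula above can be sketched as a small function. This is an illustrative sketch, not the app's actual implementation; the amplitude and sensitivity-peak constants are assumptions chosen from the typical ranges listed above.

```typescript
// Illustrative constants (assumptions, not calibrated values from the app).
const A_HOURS = 2; // amplitude of the phase shift, in hours
const T_MAX = 3;   // clock hour of maximum light sensitivity (~3 AM)

/** Phase shift (hours) produced by light exposure at clock hour `t` (0–24). */
function phaseShift(t: number, a: number = A_HOURS, tMax: number = T_MAX): number {
  return a * Math.sin((2 * Math.PI * (t - tMax)) / 24);
}
```

Light at the sensitivity peak produces no shift, while light six hours later produces the maximum advance—matching the sinusoidal shape of the PRC.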

The Multi-Axis Data Visualization Challenge

Building dual Y-axis charts in Recharts for the Insights page revealed a critical UX lesson: different data types need different visual languages. Sleep hours (continuous) and phase differences (cyclical) can't share the same scale. By using yAxisId={1} and yAxisId={2}, I created visual clarity.
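Before two differently-scaled series can share a dual-axis chart, they have to be merged into a single dataset keyed by date. A minimal sketch, assuming hypothetical `SleepLog`/`PhaseLog` shapes (not the app's real types):

```typescript
// Hypothetical record shapes for the two series.
interface SleepLog { date: string; hours: number }
interface PhaseLog { date: string; deltaPhi: number }

// One chart point may carry either or both values.
interface ChartPoint { date: string; hours?: number; deltaPhi?: number }

/** Merge the two series into one date-sorted dataset for a dual-axis chart. */
function mergeSeries(sleep: SleepLog[], phase: PhaseLog[]): ChartPoint[] {
  const byDate = new Map<string, ChartPoint>();
  for (const s of sleep) byDate.set(s.date, { date: s.date, hours: s.hours });
  for (const p of phase) {
    const point = byDate.get(p.date) ?? { date: p.date };
    point.deltaPhi = p.deltaPhi;
    byDate.set(p.date, point);
  }
  return [...byDate.values()].sort((a, b) => a.date.localeCompare(b.date));
}
```

Each series then binds to its own axis via its `yAxisId`, so the 0–12 hour sleep scale never distorts the ±3 hour phase scale.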

Voice as an Extrasensory Input

Integrating the Web Speech API taught me that the best interface for tracking a sense is another sense entirely. Voice commands let users log experiences the moment they notice them—"I feel alert," "light session started"—creating a feedback loop that strengthens circadian awareness over time.
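Turning a raw transcript into an action is ordinary string matching. A hedged sketch of such a parser—the `Command` union and the three patterns below are illustrative assumptions, not the app's full 25+ command grammar:

```typescript
// Illustrative command union; the real app recognizes many more.
type Command =
  | { kind: "logAlertness" }
  | { kind: "startLightSession" }
  | { kind: "logSleep"; hours: number };

/** Map a speech transcript to a command, or null if nothing matches. */
function parseCommand(transcript: string): Command | null {
  const text = transcript.toLowerCase().trim();
  if (text.includes("i feel alert")) return { kind: "logAlertness" };
  if (text.includes("light session started") || text.includes("start light session"))
    return { kind: "startLightSession" };
  const sleep = text.match(/log (\d+(?:\.\d+)?) hours? sleep/);
  if (sleep) return { kind: "logSleep", hours: Number(sleep[1]) };
  return null;
}
```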

🛠️ How I Built It

Tech Stack

- React 18 + TypeScript - Type-safe component architecture
- react-router (Data mode) - Multi-page navigation with nested routes
- Recharts - Data visualization with hardcoded xAxisId={1} and yAxisId={1} to fix React 18 Strict Mode double-render bugs
- motion/react - Smooth animations that mirror biological rhythms
- Sonner - Toast notifications for user feedback
- Web Speech API - Voice command recognition for hands-free logging
- localStorage - Client-side persistence (no backend required)

Architecture Decisions

1. Multi-Context Provider Pattern

I built 9 custom React contexts to manage different aspects of circadian tracking:

// ... app components

This created a clean separation of concerns—auth state doesn't pollute circadian data, and theme preferences persist across profile switches.

2. Automatic Profile Creation

The biggest UX innovation was eliminating the sign-in barrier. On first launch, LumenSync automatically creates a "My Profile" stored in localStorage:

const defaultProfile: UserProfile = {
  id: crypto.randomUUID(),
  name: "My Profile",
  initials: "MP",
  createdAt: new Date().toISOString(),
};

Users start tracking immediately, lowering the activation energy for habit formation.

3. Timezone Intelligence

I built a database of 100+ major cities with their timezone offsets for the Jet Lag Planner feature. The system calculates optimal light exposure schedules based on destination timezone:

$$t_{exposure} = t_{current} + \frac{\Delta TZ}{2}$$

This pre-adapts your circadian phase before travel, reducing jet lag severity.
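The half-shift heuristic above can be sketched in a few lines. Hour arithmetic has to wrap onto the 24-hour clock; the parameter names here are assumptions, not the app's actual API:

```typescript
/**
 * Suggested light-exposure hour: shift the current hour by half the
 * timezone difference, wrapped onto a 0–24 clock.
 */
function exposureHour(currentHour: number, tzDeltaHours: number): number {
  const shifted = currentHour + tzDeltaHours / 2;
  return ((shifted % 24) + 24) % 24; // handles negative values and >24 wrap
}
```

For an 8-zone eastward trip, a 10 PM light session maps to a 2 AM target, illustrating why the wraparound matters.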

Key Features Implementation

Voice Commands (25+ working commands):

const recognition = new webkitSpeechRecognition();
recognition.continuous = false;
recognition.interimResults = false;

recognition.onresult = (event) => {
  const transcript = event.results[0][0].transcript.toLowerCase();
  // Parse commands like "start light session", "log 8 hours sleep"
};

Ideas Tracking System: 12 evidence-based circadian experiments (morning light exposure, caffeine timing, cold showers) with localStorage persistence to track "started" state and progress.

Photo Journal with Temporal Context: Each photo stores both the image and the time-of-day metadata, creating visual anchors that strengthen memory formation (another extrasensory function tied to circadian rhythms).

🚧 Challenges Faced

1. React 18 Strict Mode + Recharts

React 18's double-invocation in development mode broke Recharts animations. The solution? Hardcoding xAxisId={1} and yAxisId={1} on every axis and series.

This fixed the "duplicate key" warnings but required meticulous consistency across all chart components.

2. AuthProvider Rendering Before Route Matching

When implementing automatic profile creation, I hit a critical error: "useAuth must be used within AuthProvider". The issue? React Router was instantiating components during route matching, before the provider tree rendered.

Solution: Changed from Component prop to element prop:

// ❌ Breaks - instantiated too early
{ path: "/", Component: ProtectedRoute }

// ✅ Works - rendered after providers
{ path: "/", element: <ProtectedRoute /> }

3. Visualizing Circular Time on Linear Axes

Circadian rhythms are cyclical (24-hour repeating), but most charts are linear. How do you show someone their "phase" when it wraps around midnight?

I solved this by normalizing all timestamps to a 0-24 scale and using color gradients to indicate "biological night" vs "biological day":

const phaseColor = phase < -1 ? "red" : phase > 1 ? "blue" : "green";

Red = delayed phase (night owl), Blue = advanced phase (early bird), Green = synchronized.
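The normalization step itself can be sketched like this—a hedged illustration of mapping timestamps onto a 0–24 scale and taking differences that respect the wraparound (function names here are assumptions, not the app's code):

```typescript
/** Map a timestamp onto the 0–24 hour scale. */
function normalizeHour(date: Date): number {
  return date.getHours() + date.getMinutes() / 60;
}

/** Smallest signed difference between two clock hours, in (-12, 12]. */
function circularDiff(a: number, b: number): number {
  let d = (a - b) % 24;
  if (d > 12) d -= 24;
  if (d <= -12) d += 24;
  return d;
}
```

Without the circular difference, 11 PM and 1 AM would look 22 hours apart instead of 2—exactly the midnight-wrap problem described above.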

4. localStorage Limits & Profile Switching

With 9 different features storing data (sleep logs, light sessions, reminders, photos, habits), localStorage quickly filled up. I implemented a profile-scoped namespace system:

const key = `lumensync-${profileId}-sleepData`;
localStorage.setItem(key, JSON.stringify(data));

When switching profiles, app data clears but profile metadata persists—giving each user a clean slate while preserving multi-user support.
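The profile-scoped clearing can be sketched as follows. The storage interface is injected so the logic runs outside a browser; `KVStore` and `clearProfileData` are hypothetical names, and the key shape mirrors the snippet above:

```typescript
// Minimal storage interface (a subset of what localStorage offers).
interface KVStore {
  keys(): string[];
  removeItem(key: string): void;
}

/** Build the namespaced key for one feature of one profile. */
const scopedKey = (profileId: string, feature: string) =>
  `lumensync-${profileId}-${feature}`;

/** Remove all feature data for one profile, leaving other profiles intact. */
function clearProfileData(storage: KVStore, profileId: string): void {
  const prefix = `lumensync-${profileId}-`;
  for (const key of storage.keys()) {
    if (key.startsWith(prefix)) storage.removeItem(key);
  }
}
```

Because every key carries its profile ID, clearing one profile is a prefix scan that can never touch another user's data.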

🎯 The Extrasensory Connection

Every design decision in LumenSync serves one goal: making the invisible, visible.

- Charts turn neural oscillations into visual patterns
- Voice commands let users externalize internal sensations
- Timezone maps make jet lag predictable instead of mysterious
- Photo journals anchor abstract time to concrete memories
- Multi-profile support acknowledges that every person's internal clock is unique

The app doesn't just track circadian rhythms—it creates a new form of temporal self-awareness. Users develop what I call "chrono-intuition": the ability to predict their energy levels, optimize their schedule, and sync with their biology instead of fighting it.

🌟 What's Next

Future versions will include:

- Wearable integration (Apple Watch, Oura Ring) for automatic light/sleep tracking
- ML-powered predictions using historical data to forecast tomorrow's energy curve
- Social features for family circadian synchronization
- Export to health platforms (Apple Health, Google Fit)

But the core mission remains: empowering people to sense what they couldn't sense before.
