Inspiration
People living with Parkinson’s, like Will's Grandma, often struggle to monitor their condition consistently — from tremor intensity to medication adherence and symptom journaling. Most existing tools are either too manual, not tailored to their experience, or don’t integrate real-time input like speech. Typing and pressing small buttons can be very difficult and frustrating. We wanted to build a system that felt natural, non-invasive, and helpful, empowering users to track their condition with minimal friction.
What it does
Our app allows Parkinson’s patients to:
- Track tremor severity in real time using accelerometer data and signal processing (frequency analysis of the time-domain signal)
- Use their voice to log symptoms and journal experiences — powered by AI speech-to-text
- Log medication usage, and link it directly to tremor events and journal entries
- Visualize symptom progression over time through clean, accessible charts
The experience is mostly voice-controlled, making it easier to use even during symptom flare-ups.
How we built it
- React Native with Expo for cross-platform mobile development
- SQLite (via expo-sqlite) for local session and medication logging
- DeviceMotion API for real-time tremor frequency and amplitude calculation
- Fast Fourier Transform (FFT) for signal analysis and tremor quantification (intensity and dominant frequency)
- OpenAI's Whisper (via AssemblyAI) for converting user speech into structured journal data
- Modular service architecture for handling database queries cleanly
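To make the FFT step above concrete, here is a minimal sketch of extracting the dominant tremor frequency from a window of accelerometer samples. For clarity it uses a plain DFT loop rather than a real FFT library, and the sampling rate and window size are illustrative assumptions, not the app's actual configuration.

```javascript
// Sketch: estimating the dominant tremor frequency from accelerometer
// samples. Assumes a fixed sampling rate; a production app would use an
// FFT library instead of this O(n^2) DFT loop.
function dominantFrequency(samples, sampleRateHz) {
  const n = samples.length;
  const mean = samples.reduce((a, b) => a + b, 0) / n;
  let bestBin = 1;
  let bestPower = 0;
  // Only bins up to the Nyquist frequency carry unique information.
  for (let k = 1; k <= Math.floor(n / 2); k++) {
    let re = 0;
    let im = 0;
    for (let t = 0; t < n; t++) {
      const angle = (-2 * Math.PI * k * t) / n;
      const x = samples[t] - mean; // remove the DC (gravity) component
      re += x * Math.cos(angle);
      im += x * Math.sin(angle);
    }
    const power = re * re + im * im;
    if (power > bestPower) {
      bestPower = power;
      bestBin = k;
    }
  }
  return (bestBin * sampleRateHz) / n; // bin index -> Hz
}

// Example: synthesize a 5 Hz tremor sampled at 100 Hz.
const rate = 100;
const samples = Array.from({ length: 256 }, (_, t) =>
  Math.sin(2 * Math.PI * 5 * (t / rate))
);
console.log(dominantFrequency(samples, rate)); // close to 5 Hz
```

The peak power also gives a rough intensity measure, which is how a single window of motion data can yield both the frequency and severity numbers mentioned above.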
Challenges we ran into
- Signal analysis: Accurately extracting tremor frequency and intensity from noisy accelerometer data was difficult.
- Speech integration: Balancing latency and accuracy when using real-time transcription while keeping the UI responsive.
- Voice-first UX: Ensuring the app is usable with minimal touch, which meant rethinking interactions, feedback, and session flow.
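Two standard tricks for the noisy-signal problem are applying a window function before the transform and restricting analysis to a plausible tremor band. The sketch below assumes a Hann window and a 3-12 Hz band; the exact band is an illustrative choice (resting tremor in Parkinson's is typically around 4-6 Hz), not a clinical threshold from the app.

```javascript
// Sketch: taming noisy accelerometer data before frequency analysis.
// A Hann window reduces spectral leakage, and summing power only inside
// a plausible tremor band ignores out-of-band noise (e.g. walking or
// sensor jitter). Band edges here are illustrative assumptions.
function hannWindow(samples) {
  const n = samples.length;
  return samples.map(
    (x, t) => x * 0.5 * (1 - Math.cos((2 * Math.PI * t) / (n - 1)))
  );
}

function bandPower(samples, sampleRateHz, loHz, hiHz) {
  const n = samples.length;
  const windowed = hannWindow(samples);
  const loBin = Math.max(1, Math.ceil((loHz * n) / sampleRateHz));
  const hiBin = Math.min(Math.floor(n / 2), Math.floor((hiHz * n) / sampleRateHz));
  let total = 0;
  for (let k = loBin; k <= hiBin; k++) {
    let re = 0;
    let im = 0;
    for (let t = 0; t < n; t++) {
      const angle = (-2 * Math.PI * k * t) / n;
      re += windowed[t] * Math.cos(angle);
      im += windowed[t] * Math.sin(angle);
    }
    total += re * re + im * im;
  }
  return total;
}
```

Comparing in-band power against total power gives a quick confidence check that a detected peak is actually tremor rather than noise.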
Accomplishments that we're proud of
- Built a real-time tremor analyzer using raw accelerometer signals and FFT
- Seamlessly integrated AI-powered journaling from speech input
- Created a voice-friendly, non-clinical user interface
- Fully linked symptom data, medication usage, and personal reflections into one timeline
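The unified timeline boils down to tagging each record type and sorting by time. This is a minimal sketch; the field names (`timestamp`, `kind`) are illustrative, and in the app the records come from expo-sqlite rather than in-memory arrays.

```javascript
// Sketch: merging tremor events, medication logs, and journal entries
// into one chronological timeline. Field names are illustrative
// assumptions; the real records live in SQLite tables.
function buildTimeline(tremorEvents, medicationLogs, journalEntries) {
  const tagged = [
    ...tremorEvents.map((e) => ({ kind: 'tremor', ...e })),
    ...medicationLogs.map((e) => ({ kind: 'medication', ...e })),
    ...journalEntries.map((e) => ({ kind: 'journal', ...e })),
  ];
  return tagged.sort((a, b) => a.timestamp - b.timestamp);
}

// Example: one event of each kind, out of order.
const timeline = buildTimeline(
  [{ timestamp: 3, magnitude: 0.8 }],
  [{ timestamp: 1, drug: 'levodopa' }],
  [{ timestamp: 2, text: 'felt steadier after breakfast' }]
);
console.log(timeline.map((e) => e.kind)); // medication, journal, tremor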
What we learned
- How to perform signal analysis and frequency extraction with FFT on mobile devices using built-in accelerometers
- How to design and implement speech-first UX patterns
- How to build functional mobile apps with lightweight tools like SQLite and React Native
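One speech-first pattern we found useful is turning a raw transcript into a structured entry before it ever hits the database. The sketch below assumes simple keyword matching; the keyword lists and field names are illustrative, not the app's actual extraction logic.

```javascript
// Sketch: converting a raw speech transcript into a structured journal
// entry. Keyword lists and field names are illustrative assumptions.
const SYMPTOM_KEYWORDS = ['tremor', 'stiffness', 'fatigue', 'dizziness'];
const MEDICATION_KEYWORDS = ['levodopa', 'carbidopa'];

function parseTranscript(text) {
  const lower = text.toLowerCase();
  return {
    raw: text,
    symptoms: SYMPTOM_KEYWORDS.filter((s) => lower.includes(s)),
    medications: MEDICATION_KEYWORDS.filter((m) => lower.includes(m)),
    loggedAt: Date.now(),
  };
}

const entry = parseTranscript('Took my levodopa this morning but the tremor came back');
console.log(entry.symptoms, entry.medications);
```

Keeping this step as a pure function made it easy to test independently of the transcription service and the UI.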
What's next for Parkinson’s App
- Adding alerts and reminders for medication based on trends and missed logs
- Training an ML model to correlate tremor patterns with medication effectiveness
- Integrating remote data sync so caregivers and clinicians can view insights
- Supporting multi-language voice input to make the app more accessible globally
Built With
- assembly-ai
- javascript
- react-native
- sqlite