Ora — Project Story
About the project
Ora started from a practical frustration: the gap between training hard and tracking consistently. Most workout apps do one of two things:
- They optimize for long-term analytics, but make the moment-to-moment logging experience slow.
- They simplify the UI, but still require enough taps and context switching that you “just log it later” (and then don’t).
Ora is my attempt to treat logging as a real-time systems problem: reduce latency at the moment of action (between sets), preserve determinism, and keep the user in control of their data. The north star is simple:
> Logging should feel like sending a voice note, not filling out a form.
That is why Ora is local-first by default, voice-driven for capture, and intentionally minimal in UI. The app is designed to work offline, remain usable even with zero accounts, and only “scale up” into cloud features when the user explicitly opts in.
What inspired Ora
I care a lot about systems that reinforce discipline, not systems that require discipline to use. In the gym, attention is a limited resource: you’re tracking rest times, technique cues, progression decisions, and the social/physical environment. When an app demands too much cognition—searching exercises, navigating menus, typing numbers—it competes with the actual training.
So Ora was inspired by a behavioral design question:
- If the “tracking tax” is the reason consistency fails, can we make that tax asymptotically small?
I approached the product like I approach engineering: focus on the bottleneck. In this case, the bottleneck isn’t fancy analytics—it’s capture. If capture is frictionless, everything else becomes possible.
How we built it
1) Local-first core (privacy + reliability)
Ora stores its core data on-device using SQLite (via sqflite, with desktop FFI support). The design intent is that the app remains fully functional offline—because consistency is a daily habit, not a cloud service.
The database schema is structured around real workflows:
- Programs → Days → Exercises → Set plan blocks
- Sessions → Per-session exercises → Set entries
- Diet entries, appearance entries, profile/settings, etc.
Migrations are versioned so the app can evolve without breaking user history.
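Ora's persistence layer is Dart/sqflite, but the versioned-migration idea translates directly to Python's `sqlite3`. The sketch below is illustrative: table and column names are simplified stand-ins, not Ora's actual schema. Each migration upgrades the database by exactly one version, tracked with SQLite's built-in `user_version` pragma, so old installs replay only the migrations they are missing.

```python
import sqlite3

# Each entry upgrades the schema by one version (names are illustrative).
MIGRATIONS = [
    # v1: core program hierarchy
    """
    CREATE TABLE programs (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE days (id INTEGER PRIMARY KEY,
                       program_id INTEGER REFERENCES programs(id),
                       name TEXT NOT NULL);
    """,
    # v2: session logging, added later without breaking v1 data
    """
    CREATE TABLE sessions (id INTEGER PRIMARY KEY, started_at TEXT NOT NULL);
    CREATE TABLE set_entries (id INTEGER PRIMARY KEY,
                              session_id INTEGER REFERENCES sessions(id),
                              exercise TEXT NOT NULL,
                              weight REAL, reps INTEGER);
    """,
]

def migrate(conn: sqlite3.Connection) -> int:
    """Apply any pending migrations; return the final schema version."""
    version = conn.execute("PRAGMA user_version").fetchone()[0]
    for target, script in enumerate(MIGRATIONS, start=1):
        if target > version:
            conn.executescript(script)
            conn.execute(f"PRAGMA user_version = {target}")
    conn.commit()
    return conn.execute("PRAGMA user_version").fetchone()[0]
```

Because `user_version` persists inside the database file itself, re-running `migrate` on an up-to-date database is a no-op, which is what keeps user history safe across app updates.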
2) Deterministic training session logging (command bus + undo/redo)
A key architectural decision was to make session logging deterministic. Instead of “random UI state changes,” Ora treats session events as commands:
`LogSet`, `SwitchExercise`, `StartRestTimer`, `Undo`, `Redo`, `FinishWorkout`, etc.
This enables an explicit undo/redo stack, which is surprisingly important in real lifting scenarios (misheard voice input, wrong weight, accidental taps). It also makes the system easier to test and reason about.
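The core of the pattern fits in a few lines. Ora's real command bus is written in Dart; this is a minimal Python sketch of the same idea, with a single hypothetical `LogSet` command. The key invariant is that dispatching a new command clears the redo stack, so the session history always replays deterministically.

```python
from dataclasses import dataclass, field

@dataclass
class LogSet:
    """Hypothetical command shape for illustration."""
    exercise: str
    weight: float
    reps: int

@dataclass
class SessionLog:
    entries: list = field(default_factory=list)   # applied commands
    _undone: list = field(default_factory=list)   # redo stack

    def dispatch(self, cmd: LogSet) -> None:
        self.entries.append(cmd)
        self._undone.clear()  # a new command invalidates any pending redo

    def undo(self) -> None:
        if self.entries:
            self._undone.append(self.entries.pop())

    def redo(self) -> None:
        if self._undone:
            self.entries.append(self._undone.pop())
```

Because every mutation flows through `dispatch`, "misheard voice input" is a one-tap `undo` rather than a hunt through edit screens.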
3) Voice-first capture pipeline (STT → NLU → optional LLM → dispatch)
Voice logging is not “just speech-to-text.” The hard part is converting messy language into structured actions reliably. Ora’s voice pipeline is intentionally layered:
- Audio capture (cross-platform mic handling)
- Local speech-to-text (Vosk for offline STT)
- Rule-based fixes + normalization (to reduce common transcript failure modes)
- NLU parsing into intents/slots (e.g., exercise, weight, reps)
- Optional cloud parsing (Gemini/OpenAI) for refinement when needed
- Command dispatch into the command bus
The guiding philosophy is: local heuristics first, cloud only as an assistive layer, and store the result as structured data in SQLite.
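The "local heuristics first" layers can be sketched in a few lines. The real pipeline is Dart and far richer; here the normalization rules and the grammar pattern are hypothetical examples of the kind of rules involved, not Ora's actual rule set.

```python
import re

# Hypothetical fixes for common STT failure modes.
FIXES = {"for ate": "for 8", "one thirty five": "135"}

def normalize(transcript: str) -> str:
    text = transcript.lower().strip()
    for bad, good in FIXES.items():
        text = text.replace(bad, good)
    return text

# Minimal intent/slot grammar: "<exercise> <weight> for <reps>"
PATTERN = re.compile(
    r"^(?P<exercise>[a-z ]+?)\s+(?P<weight>\d+(?:\.\d+)?)\s+for\s+(?P<reps>\d+)$"
)

def parse_log_set(transcript: str):
    m = PATTERN.match(normalize(transcript))
    if not m:
        return None  # caller can fall back to cloud parsing here
    return {"intent": "LogSet",
            "exercise": m["exercise"],
            "weight": float(m["weight"]),
            "reps": int(m["reps"])}
```

A `None` result is the hand-off point: only utterances the local layers cannot resolve ever need to leave the device.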
When cloud parsing is enabled, Ora pushes for JSON-structured outputs so the system stays robust. The goal is to transform natural language into something like:
```json
{
  "exercise": "Machine Chest Press",
  "sets": [{"weight": 185, "reps": 8}]
}
```
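"Robust" here means validating the model's output before it ever touches the database. A minimal shape check in Python (the field names match the JSON example above; the thresholds are illustrative) might look like:

```python
import json

def validate_parsed(raw: str):
    """Return the parsed dict if it matches the expected shape, else None.
    Anything off-shape is rejected rather than written to SQLite."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if not isinstance(data.get("exercise"), str):
        return None
    sets = data.get("sets")
    if not isinstance(sets, list) or not sets:
        return None
    for s in sets:
        if not isinstance(s, dict):
            return None
        if not isinstance(s.get("weight"), (int, float)) or s["weight"] < 0:
            return None
        if not isinstance(s.get("reps"), int) or s["reps"] <= 0:
            return None
    return data
```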
4) The Ora Orb (floating input hub)
To make capture ubiquitous, Ora introduces a floating “Ora Orb” that acts as an always-available input hub across tabs:
- Camera, Upload, Mic, Text (2×2 deck)
- Drag-and-dock behavior with persistence
- Short “routing” state where inputs are classified and auto-navigated
This is human-centered design in practice: the user shouldn’t hunt for the right screen—Ora should meet them where they are and route intelligently.
5) Beyond training: diet + appearance (holistic but modular)
Ora expands beyond lifting logs into:
- Diet tracking (macros/micros, goals, time aggregation)
- Appearance tracking (measurements, notes, progress photos)
- Optional “analysis tasks” routed through an upload queue when cloud features are enabled
These are separate modules, but they share the same product principle: fast entry, clear visuals, and user-controlled privacy.
What I learned
Building Ora reinforced a few lessons that feel broadly applicable:
- UX is an engineering constraint. The best architecture is irrelevant if capture is annoying. The UI and the data model have to co-evolve.
- Determinism beats cleverness for daily workflows. Undo/redo, command logs, and predictable routing matter more than “smart” behavior that surprises the user.
- Voice interfaces fail in specific, repeatable ways. The biggest gains came from boring improvements: normalization rules, exercise aliasing, fuzzy matching, and validation.
- Local-first changes the entire product posture. Instead of building around accounts and sync, you build around resilience: offline, fast startup, stable storage, and explicit opt-in for cloud.
I also sharpened practical cross-platform skills: Flutter/Riverpod state boundaries, SQLite migrations, audio permissions and device quirks, secure key storage (Keychain/Keystore), and designing modular services that don’t collapse into a monolith.
Challenges we faced
Reliable voice-to-structure
Speech-to-text is easy to demo and hard to trust. The real challenge was translating partial, ambiguous phrases (“lat pulldown… 160… for eight”) into valid structured commands without breaking flow. The solution was layering:
- rule-based fixes for known errors,
- exercise matching against a catalog,
- optional cloud parsing when the local model is uncertain,
- and validation before writing to SQLite.
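The "exercise matching against a catalog" step is where most of the reliability came from. A minimal sketch using Python's standard-library `difflib` (the catalog and aliases below are illustrative, and Ora's real matcher also weighs per-user history):

```python
from difflib import SequenceMatcher

# Hypothetical catalog: canonical name -> spoken aliases.
CATALOG = {
    "Lat Pulldown": ["lat pulldown", "pulldown", "lat pull down"],
    "Machine Chest Press": ["machine chest press", "chest press machine"],
}

def match_exercise(heard: str, threshold: float = 0.75):
    """Fuzzy-match a transcript fragment to a canonical exercise name."""
    heard = heard.lower().strip()
    best_name, best_score = None, 0.0
    for canonical, aliases in CATALOG.items():
        for alias in aliases:
            score = SequenceMatcher(None, heard, alias).ratio()
            if score > best_score:
                best_name, best_score = canonical, score
    # Below the threshold, defer (e.g. to cloud parsing) instead of guessing.
    return best_name if best_score >= threshold else None
```

Returning `None` below the threshold is deliberate: a wrong silent match is worse for trust than asking the cloud layer, or the user, to disambiguate.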
Cross-platform constraints
A feature that works on Android doesn’t automatically behave the same on iOS or desktop—especially audio capture, permissions, file paths, and background/foreground behavior. We had to be deliberate about “demoable everywhere” versus “best-in-class on one platform.”
Scope vs shipping
Ora combines training, diet, appearance tracking, voice, and optional AI, and it would be easy to let that scope balloon. The hardest product decision was repeatedly narrowing the focus back to the core promise: frictionless logging.
Privacy without killing utility
Cloud features can add real value (clean parsing, summaries, photo analysis), but they come with trust costs. Ora’s stance is strict: local-first always, cloud opt-in only, and prefer sending text rather than raw media whenever possible.
A bit of math (because it’s useful)
Even simple analytics become meaningful when capture is consistent. For example, training volume can be computed as:
\[ V = \sum_{i=1}^{N} w_i \cdot r_i \]
where \(w_i\) is the weight and \(r_i\) the reps for set \(i\). A simple activity score (useful for a “leaderboard” or habit reinforcement) can be framed as a weighted combination:
\[ S = \alpha V + \beta N_{\text{sets}} + \gamma N_{\text{days}} \]
The point is not perfect physiology—it’s feedback that is clear and motivating, without forcing the user into a complex dashboard.
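Both formulas reduce to a few lines over the structured set entries. The numbers and the coefficients \(\alpha, \beta, \gamma\) below are made up for illustration; tuning them is a product decision, not physiology.

```python
# Session data shaped like Ora's set entries (values illustrative).
sets = [
    {"weight": 185, "reps": 8},
    {"weight": 185, "reps": 6},
    {"weight": 165, "reps": 10},
]

# Volume: V = sum over sets of weight_i * reps_i
volume = sum(s["weight"] * s["reps"] for s in sets)

# Activity score: S = alpha*V + beta*N_sets + gamma*N_days
alpha, beta, gamma = 0.01, 1.0, 5.0  # arbitrary example coefficients
n_sets, n_days = len(sets), 1
score = alpha * volume + beta * n_sets + gamma * n_days
```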
What’s next
Ora is intentionally built to scale in capability without losing its core identity. The roadmap direction is:
- Better structured parsing (schema-first, validation-first)
- Improved exercise matching (aliases, equipment context, per-user history bias)
- Smarter summaries (weekly deltas, progression suggestions, recovery flags)
- Optional on-device LLM support for private assistance
- Lightweight export workflows (PDF/CSV) that respect local-first principles
Built with (high level)
- Flutter (Dart) for cross-platform delivery
- Riverpod for scalable state management
- SQLite (sqflite + desktop FFI) for local-first persistence
- Vosk for offline speech-to-text
- Optional LLM parsing via Gemini/OpenAI (JSON-structured outputs)
- Firebase Auth/Storage (optional) for opt-in cloud workflows
- Secure storage for API keys (Keychain/Keystore)
- SVG anatomy assets for muscle mapping and UI visualization
GitHub link: https://github.com/Jibby2k1/Ora
