Stride
Inclusive, audio-first safety — built during a hackathon
Quick links
- Inspiration
- What it does
- How we built it
- Challenges we ran into
- Accomplishments that we're proud of
- What we learned
- What's next for Stride
Inspiration
The fall detection market is projected to reach $2.3 billion by 2028, yet it's failing 7.5 million visually impaired individuals in North America alone. Existing solutions like Apple Watch and Life Alert assume users can see screens and navigate apps. Meanwhile, visually impaired adults experience falls at twice the rate of sighted individuals, with 30% suffering serious injuries.
We saw a massive market gap: accessible safety technology doesn't exist at scale. Stride was born from recognizing that inclusive design isn't just ethical—it's an untapped market opportunity with clear product-market fit.
What it does
Stride is the first fall detection wearable built specifically for the visually impaired market. Our device provides:
Core Value Proposition:
- Real-time fall detection using multi-axis motion sensors
- Real-time phone calls and SMS to alert your caregiver
- Zero-vision interface through audio announcements and haptic feedback
- Automatic emergency alerts to designated contacts
- Standalone operation with no smartphone dependency
- Companion caregiver app for remote monitoring
Technical Features:
- Continuously monitors movement patterns using accelerometer and gyroscope sensors
- Automatically detects when a fall occurs through motion analysis
- Provides immediate audio announcements and haptic (vibration) feedback to communicate with the user
- Sends automatic alerts to emergency contacts or caregivers
- Works independently without requiring a smartphone or visual interface
- Uses simple tactile buttons for user control and acknowledgment
Competitive Advantage:
Unlike Apple Watch or medical alert systems, Stride is purpose-built for accessibility. Every interaction is audio-first, every feature works without vision, and the entire user experience centers on independence rather than dependence on caregivers.
How we built it
MVP Development Strategy:
We prioritized speed-to-market with an Arduino-based prototype that proves core functionality while minimizing development costs.
Hardware:
- Arduino Nano 33 BLE Sense Lite with built-in 9-axis IMU (accelerometer, gyroscope, magnetometer)
- Limitations: no Wi-Fi connection, so the board must stay plugged into a computer (demo limitation); an ideal board would be one we could establish a standalone Bluetooth connection with
Software:
- Motion detection algorithms analyzing accelerometer data for fall patterns
- Person-detection ML model that detects whether a person is near you and communicates this through the app
- Obstacle detection that checks for blockers in your path and alerts the user
- Arduino C++ for device programming and sensor integration
- Bluetooth Low Energy (BLE) for wireless connectivity to caregiver app
- Mobile companion app (React Native) for caregiver notifications and monitoring
Overview of Tech
A sensor-to-notification pipeline: an Arduino detects falls and nearby presence, broadcasts events over BLE; a gateway (native or web) forwards them to an API and triggers UI actions like emergency calls and SMS via a Twilio microservice.
End-to-end flow
- Arduino sensor node → BLE notifications (fall/proximity events)
- Gateway app (choose one):
  - Mobile native BLE gateway (`gateway`, React Native/Expo)
  - Web Bluetooth gateway (`mobile-app` → `BleGateway`)
- HTTP POST → Next.js API (`next` → `/api/fall`)
- UI actions in the `mobile-app` dashboard (auto-call, share location)
- Voice/SMS via `twilio-server`
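The decode-and-forward step shared by both gateways can be sketched as follows; the `FallEvent` shape and the API URL here are illustrative assumptions, not the project's actual schema:

```typescript
// Sketch of a gateway's decode-and-forward step. The FallEvent shape and the
// API URL are illustrative assumptions, not the project's actual schema.

interface FallEvent {
  type: "fall" | "proximity";
  timestamp: number;                 // ms since device boot (assumed)
  features?: Record<string, number>; // sensor-derived features (assumed)
}

// BLE characteristics deliver raw bytes; the Arduino emits compact JSON.
function decodeNotification(payload: Uint8Array): FallEvent {
  return JSON.parse(new TextDecoder().decode(payload)) as FallEvent;
}

// Forward a decoded event to the Next.js API route.
async function forwardEvent(
  event: FallEvent,
  apiUrl = "http://localhost:3000/api/fall",
): Promise<boolean> {
  const res = await fetch(apiUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(event),
  });
  return res.ok;
}
```

Either gateway (native or web) runs the same two steps: decode the notification bytes, then POST the event.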
Components and what they do
Arduino node (`arduino`)
- Continuously reads the IMU, microphone, and barometer.
- Runs a state machine to detect impacts, motion inactivity, loudness bursts, and a “lying” posture check.
- Emits compact JSON events over a custom BLE service and characteristic:
  - Fall events include a timestamp and sensor-derived features.
  - Proximity events indicate “person in front” when a pressure press is detected.
Native BLE gateway (`gateway`)
- React Native app (Expo) that scans for the Arduino, connects, subscribes to notifications, decodes messages, and forwards each event to the API.
- Also logs events locally for debugging.
Web Bluetooth gateway (`mobile-app` → `src/components/BleGateway.jsx`)
- Browser-based central that pairs with the Arduino via Web Bluetooth, subscribes to notifications, forwards events to the API, and emits callbacks for the UI.
- Useful on desktop Chrome/Edge or localhost HTTPS scenarios without installing a native app.
UI and app shell (`mobile-app`)
- Vite-powered React app with screens for loading, login, device setup, and a dashboard.
- In “gateway mode” (query param), it runs the Web Bluetooth gateway view directly.
- On fall events: highlights the emergency state and (optionally) triggers a call workflow.
- Quick actions:
  - Place an emergency call (routes to the Twilio server).
  - Call a designated caregiver.
  - Share current location (browser geolocation + SMS via Twilio).
API layer (`next`)
- Next.js route at `/api/fall` accepts event POSTs from either gateway.
- CORS-open stub ready for future processing (persistence, analytics, alerting, webhooks).
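A minimal sketch of what that stub could look like (pages-router style; the hand-rolled request/response interfaces stand in for Next.js's `NextApiRequest`/`NextApiResponse` so the logic is self-contained):

```typescript
// Sketch of the CORS-open /api/fall stub. Minimal local interfaces stand in
// for Next.js's NextApiRequest/NextApiResponse types.

interface ApiRequest { method?: string; body?: unknown; }
interface ApiResponse {
  statusCode: number;
  setHeader(name: string, value: string): void;
  json(data: unknown): void;
}

function fallHandler(req: ApiRequest, res: ApiResponse): void {
  // CORS-open so either gateway (native or web) can POST events.
  res.setHeader("Access-Control-Allow-Origin", "*");

  if (req.method !== "POST") {
    res.statusCode = 405;
    res.json({ error: "method not allowed" });
    return;
  }

  // Stub: acknowledge the event. Persistence, analytics, and alert fan-out
  // would hook in here.
  res.statusCode = 200;
  res.json({ received: true, event: req.body });
}
```
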
Telephony/SMS microservice (`twilio-server`)
- Minimal Express server for outgoing voice calls and SMS.
- Used by the dashboard quick actions and auto-call after fall detection.
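An outgoing call boils down to one authenticated, form-encoded POST against Twilio's REST Calls endpoint; a dependency-free sketch of the request builder (the account SID, auth token, and numbers are placeholders; the real server can use the `twilio` SDK instead):

```typescript
// Sketch: build the form-encoded request Twilio's Calls endpoint expects.
// Credentials and numbers are placeholders.

interface CallRequest { url: string; headers: Record<string, string>; body: string; }

function buildCallRequest(
  accountSid: string,
  authToken: string,
  to: string,
  from: string,
  twimlUrl: string,
): CallRequest {
  const params = new URLSearchParams({ To: to, From: from, Url: twimlUrl });
  return {
    url: `https://api.twilio.com/2010-04-01/Accounts/${accountSid}/Calls.json`,
    headers: {
      // Twilio uses HTTP Basic auth with the account SID and auth token.
      Authorization:
        "Basic " + Buffer.from(`${accountSid}:${authToken}`).toString("base64"),
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: params.toString(),
  };
}
```

The Express server just wraps a request like this (or the equivalent SDK call) behind an endpoint the dashboard can hit.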
How they interact at runtime
- A fall is detected → Arduino broadcasts a BLE notification.
- The gateway (native or web) receives it, forwards the event to the Next.js API, and signals the UI.
- The dashboard auto-initiates an emergency call and lets the user send their live location via SMS.
- All along, the API can be extended to log events or fan out alerts.
Data and connectivity
- BLE: Arduino ↔ Gateway (notifications on a custom service/characteristic).
- HTTP: Gateway → Next.js API for event ingestion.
- HTTP: UI → Twilio server for calls/SMS.
- Geolocation: Browser collects coordinates for SMS location sharing.
- Firebase: `mobile-app` initializes Authentication and Firestore for future auth/data use.
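The location-sharing path (browser geolocation → SMS) can be sketched like this; the message format and the `/sms` endpoint path are assumptions for illustration:

```typescript
// Sketch: turn browser geolocation coordinates into an SMS payload for the
// Twilio server. The message format and endpoint path are assumptions.

interface SmsPayload { to: string; body: string; }

function buildLocationSms(to: string, lat: number, lng: number): SmsPayload {
  // A maps link of this form opens the pin directly on most phones.
  const link = `https://maps.google.com/?q=${lat.toFixed(5)},${lng.toFixed(5)}`;
  return {
    to,
    body: `Stride alert: a fall was detected. Last known location: ${link}`,
  };
}

// In the browser, the coordinates come from the Geolocation API:
// navigator.geolocation.getCurrentPosition((pos) => {
//   const sms = buildLocationSms(caregiverNumber,
//     pos.coords.latitude, pos.coords.longitude);
//   fetch("http://localhost:3001/sms", {
//     method: "POST",
//     headers: { "Content-Type": "application/json" },
//     body: JSON.stringify(sms),
//   });
// });
```
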
Key Technical Approaches:
1. Threshold-based Fall Detection
We implement a multi-threshold algorithm that analyzes acceleration magnitude:
$$a_{total} = \sqrt{a_x^2 + a_y^2 + a_z^2}$$
Where $a_x$, $a_y$, $a_z$ are accelerometer readings on each axis.
A fall is detected when:
- Free-fall threshold: $a_{total} < 0.5g$ for $t > 100ms$
- Impact threshold: $a_{total} > 3.0g$ for $t > 50ms$
- Post-impact stillness: $a_{total} \approx 1.0g$ for $t > 2s$
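These three stages form a small state machine; a compact sketch using the thresholds above (TypeScript here for readability, though the device code is Arduino C++; the 100 Hz sample rate is an assumption):

```typescript
// Three-stage threshold fall detector: free-fall -> impact -> stillness.
// Thresholds mirror the values above; the 100 Hz sample rate is illustrative.

type Phase = "idle" | "freefall" | "impact" | "stillness";

const G = 9.81;        // 1 g in m/s^2
const SAMPLE_MS = 10;  // 100 Hz accelerometer sampling (assumed)

class FallDetector {
  private phase: Phase = "idle";
  private phaseMs = 0;

  // Feed one accelerometer sample (m/s^2 per axis);
  // returns true when a fall is confirmed.
  update(ax: number, ay: number, az: number): boolean {
    const aTotal = Math.sqrt(ax * ax + ay * ay + az * az);

    switch (this.phase) {
      case "idle":
        if (aTotal < 0.5 * G) this.enter("freefall");
        break;
      case "freefall": // need a_total < 0.5 g for > 100 ms
        if (aTotal < 0.5 * G) {
          this.phaseMs += SAMPLE_MS;
        } else if (this.phaseMs > 100 && aTotal > 3.0 * G) {
          this.enter("impact"); // free fall long enough, then a hard impact
        } else {
          this.enter("idle");   // too short, or no impact: not a fall
        }
        break;
      case "impact": // a_total > 3.0 g must persist > 50 ms
        if (aTotal > 3.0 * G) {
          this.phaseMs += SAMPLE_MS;
        } else if (this.phaseMs > 50) {
          this.enter("stillness");
        } else {
          this.enter("idle");   // spike too brief
        }
        break;
      case "stillness": // near-1 g rest for > 2 s confirms the fall
        if (Math.abs(aTotal - G) < 0.2 * G) {
          this.phaseMs += SAMPLE_MS;
          if (this.phaseMs > 2000) {
            this.enter("idle");
            return true;
          }
        } else {
          this.enter("idle");   // the wearer moved: likely not a fall
        }
        break;
    }
    return false;
  }

  private enter(p: Phase) { this.phase = p; this.phaseMs = 0; }
}
```

Requiring all three phases in sequence is what keeps brief bumps or sitting down from firing the alert.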
2. Multi-axis Motion Analysis
We distinguish falls from normal movements using angular velocity:
$$\omega_{total} = \sqrt{\omega_x^2 + \omega_y^2 + \omega_z^2}$$
3. Edge Impulse-trained ML model for person detection (integrated with the Arduino)
Architecture:
---
┌─────────────────────────────────────┐
│ Arduino Nano 33 BLE Sense │
│ ┌───────────────────────────────┐ │
│ │ 9-axis IMU Sensors │ │
│ │ • Accelerometer │ │
│ │ • Gyroscope │ │
│ │ • Magnetometer │ │
│ └───────────┬───────────────────┘ │
│ │ │
│ ┌───────────▼───────────────────┐ │
│ │ Fall Detection Algorithm │ │
│ │ • Threshold analysis │ │
│ │ • Motion pattern matching │ │
│ │ • State machine logic │ │
│ └───────────┬───────────────────┘ │
│ │ │
│ ┌───────────▼───────────────────┐ │
│ │ Output Controllers │ │
│ │ • BLE notifications │ │
│ └───────────────────────────────┘ │
└─────────────────┬───────────────────┘
│ BLE
│
┌────────▼─────────┐
│ Companion App │
│ (React Native) │
│ • Caregiver UI │
│ • Alert system │
│ • Data logging │
└──────────────────┘
Challenges we ran into
Technical Challenges:
1. Calibrating Fall Detection Sensitivity
Problem: Distinguishing true falls from normal movements (sitting down, bumping into objects, rapid direction changes)
Solution: Implemented multi-stage detection with three conditions:
- Free-fall phase: $a_{total} < 0.5g$
- Impact phase: $a_{total} > 3.0g$
- Post-impact stillness: $\Delta a < 0.2g$ for $t > 2s$
This reduced false positives by ~$60\%$ compared to single-threshold detection.
2. Hardware Knowledge Gaps
Problem: This was our first-ever hardware hack; we mistook our Nano 33 BLE Sense Lite for a Nano 33 BLE Sense, but the Lite lacks several features and compatibilities, which limited our original plan
Solution:
- Built a React app instead of a direct phone connection so we at least had a viable dashboard
- Used Twilio to make real-time phone calls to an emergency contact from the React app
- Achieved the MVP after hours of YouTube videos and documentation :)
3. Audio Clarity
Problem: Ensuring audio alerts are clear without creating overwhelming sensory load
Solution:
- Testing and iteration
4. Wearable Form Factor
Challenge: Housing the components on the body was simply not possible with this hardware
Approach: Demoed with the board plugged in, but it is clear this can become a dedicated wearable with features built specifically to support visually impaired users
5. Testing Limitations
Challenge: Limited access to actual visually impaired users during hackathon
Mitigation: Followed WCAG accessibility guidelines and consulted accessibility best practices
Market Challenges:
- Regulatory pathway: Understanding FDA clearance vs. consumer wellness positioning
- Go-to-market strategy: Identifying optimal channels (B2C, B2B, insurance)
- User research: Building without extensive validation from target users
Business Model Questions:
- Hardware margins vs. subscription revenue mix
- Manufacturing partner selection for scale
- Distribution channel strategy
These challenges validated our startup approach: build, test, and iterate quickly rather than over-engineering before market validation (which is crucial for real product engagement!)
Accomplishments that we're proud of
Product Achievements:
✓ Built a fully functional device in under 24 hours (our first time working with hardware) that works independently without a smartphone
✓ Achieved strong fall detection accuracy in preliminary testing
✓ Implemented audio alerts with good response time
✓ Implemented real-time motion analysis
✓ Designed truly accessible UX
✓ Successfully integrated multiple sensors (9-axis IMU, audio, haptic) into compact form factor
✓ Developed companion app with BLE connectivity for caregiver monitoring
✓ Demonstrated accessibility technology can be both functional and empowering
Strategic Positioning:
- Extensible platform (health monitoring, navigation assistance, medication reminders)
- Built relationships with accessibility organizations during research
- Generated interest from potential beta users and distribution partners
What we learned
Technical Learnings:
1. Accessibility is Design Philosophy, Not Features
Building for visually impaired users required first-principles thinking:
- Can't retrofit visual interfaces—must design from ground up
- Every interaction must answer: "Does this work without vision?"
- Accessibility constraints drove better design for all users
2. Audio and Haptic Feedback are Powerful
Non-visual communication is highly effective when designed properly
3. Motion Detection is Complex
Distinguishing intentional movements from falls requires:
- Multi-threshold analysis
- Temporal patterns: Duration matters as much as magnitude
- Context awareness: Post-impact stillness confirms falls
- Tuning is critical: Each threshold affects accuracy vs. false positive tradeoff
4. Experiment with hardware earlier and more often
Market Insights:
1. Accessibility Market is Underserved and Willing to Pay
- Existing solutions have limited functionality
2. B2B Distribution is Key
Organizations serving visually impaired communities:
- Eager for better solutions (current options inadequate)
- Can drive adoption through recommendations
- Provide validation and credibility
3. Caregiver Anxiety Drives Purchasing
- Peace of mind is primary value proposition for caregivers
- Remote monitoring justifies subscription pricing
- Emergency alerts reduce caregiver burden
4. Niche Markets Can Scale
- North America: 7.5M visually impaired adults
- Global: 285M visually impaired individuals
- Adjacent markets: elderly (55M in the US), mobility-impaired, dementia patients
- Platform potential: Fall detection → comprehensive independence support
Startup Strategy:
1. MVP Validates Faster Than Perfect Products
Our rough prototype generated more interest than polished mockups:
- Tangible demos prove feasibility
- Working hardware > slide decks
- Early feedback guides development better than assumptions
2. Impact and Profit Align
Solving real problems creates sustainable businesses:
- Mission-driven attracts talent and customers
- Social impact opens grant funding and partnerships
- Underserved markets have less competition
3. Hardware + Software Creates Defensibility
- Pure software is easily replicated
- Integrated solutions have technical moats
- Physical products have manufacturing barriers to entry
What's next for Stride
Immediate Next Steps (0-3 months):
Technical Development:
Business Development:
- [ ] Apply for grants/accelerators:
- Target: Accessibility-focused programs
- Healthcare innovation funds
- University entrepreneurship programs
- [ ] Build strategic partnerships:
- National Federation of the Blind
- Lighthouse organizations (local chapters)
- Assisted living facilities (pilot partners)
- [ ] Secure pre-orders: Validate demand
- [ ] Refine business model: Determine hardware vs. subscription pricing mix
12-Month Roadmap:
Product Development:
- [ ] Enhance ML model
- Personalization: Learn individual movement patterns
- Target accuracy
- [ ] Add health monitoring features:
- Heart rate variability (HRV) detection
- Activity tracking and sedentary alerts
- Medication reminders via audio prompts
- [ ] Expand companion app:
- Historical data visualization
- Trend analysis and reporting
- Multi-caregiver support
- Integration with emergency services (911 auto-dial)
Manufacturing & Scale:
- [ ] FDA clearance or wellness device positioning: Define regulatory pathway
- Quality certifications: ISO 13485 (medical devices)
- Collect real-world usage data
- Validate reliability and user satisfaction
- Generate testimonials and case studies
Research References
Jin et al. (2024) - Association between vision impairment and increased prevalence of falls in older US adults
- Journal: Journal of the American Geriatrics Society
- URL: https://agsjournals.onlinelibrary.wiley.com/doi/10.1111/jgs.18879
PMC Study on Risk Factors - Risk factors of falls in elderly patients with visual impairment
- Repository: PubMed Central (PMC)
- Longitudinal study (2019-2021) of 251 elderly patients
- URL: https://pmc.ncbi.nlm.nih.gov/articles/PMC9441862/
JMIR Aging (2025) - The Impact of Vision Impairment on Self-Reported Falls Among Older US Adults
- Cross-sectional and longitudinal study using Health and Retirement Study data (1996-2020)
- URL: https://aging.jmir.org/2025/1/e68771
BMC Geriatrics (2022) - Visual risk factors for falls in older adults: a case-control study
- Case-control study with 83 falls participants and 83 non-falls participants
- URL: https://bmcgeriatr.biomedcentral.com/articles/10.1186/s12877-022-02784-3
BMC Public Health (2022) - Visual impairment and falls among older adults and elderly: evidence from longitudinal study of ageing in India
CDC MMWR (2016) - Falls Among Persons Aged ≥65 Years With and Without Severe Vision Impairment
Eye Journal (2010) - Visual loss and falls: a review
- Comprehensive review of vision and falls literature
- URL: https://www.nature.com/articles/eye201060
Stride: Moving forward with confidence, together.
Built With
- android
- arduino
- authentication
- ble
- c/c++
- css
- expo.io
- express.js
- firestore
- html
- javascript
- json
- jsx/tsx
- kotlin
- next.js
- node.js
- proguard
- react-native-ble-plx
- sdk
- twilio
- typescript
- vite
- xml


