Inspiration

Healthcare Crisis: 85% of seniors lack access to professional physiotherapy, with costs rising 12% annually. Traditional rehabilitation suffers from inconsistent quality, limited availability, and geographic constraints. We envisioned an AI-powered hardware-software ecosystem that democratizes professional-grade rehabilitation through intelligent automation.

Personal Impact: Witnessing family members struggle with limited physiotherapy access during post-surgery recovery inspired us to create a solution that brings hospital-quality care home through smart technology integration.

What it does

ORCA SAATH SAFAR delivers professional-grade rehabilitation at home by combining an AI Health Assistant powered by GPT-OSS-120B with a LIDAR-mapped smart chair and 12-axis servo control for precise automated therapy. It continuously tracks heart rate, pressure distribution, movement patterns, and muscle activity via a 15+ sensor array and presents an interactive SVG body map with clickable regions for targeted exercises. Gamified recovery features—points, leaderboards, and achievements—boost motivation, while machine-learning analytics predict outcomes and personalize treatment. Doctors can monitor progress remotely in real time and receive comprehensive GPT-generated reports for informed care decisions.

How we built it

Hardware: We designed a custom smart chair using 3D-printed components for the frame and enclosures, integrating a Velodyne VLP-16 LIDAR for 360° spatial mapping, MPU-9250 IMU modules for motion tracking, load-cell force sensors for pressure monitoring, and MAX30102 pulse oximeters for heart-rate measurement. All sensors connect to an Arduino Mega 2560 R3 for real-time data acquisition, with precision motors and servo drivers assembled via modular mounts to enable 12-axis automated positioning. The result is a robust, safety-certified rehabilitation platform built from rapid-prototyped parts and off-the-shelf electronics for easy maintenance and scalability.
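To give a flavor of the host-side data acquisition, here is a minimal Python sketch of how telemetry frames arriving from the Arduino Mega over the serial link might be parsed. The frame layout and field names (`HR`, `P`, `PITCH`) are illustrative assumptions for this writeup, not our actual firmware protocol:

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One telemetry frame from the Arduino Mega (illustrative layout)."""
    heart_rate_bpm: float    # MAX30102 pulse oximeter
    seat_pressure_kg: float  # load-cell force sensor
    pitch_deg: float         # MPU-9250 IMU orientation

def parse_frame(line: str) -> SensorFrame:
    """Parse a comma-separated frame like 'HR:72.5,P:48.2,PITCH:3.1'.

    Raises ValueError/KeyError on malformed input, so the intermittent
    serial dropouts we saw in testing surface immediately instead of
    silently corrupting downstream fusion.
    """
    fields = dict(part.split(":") for part in line.strip().split(","))
    return SensorFrame(
        heart_rate_bpm=float(fields["HR"]),
        seat_pressure_kg=float(fields["P"]),
        pitch_deg=float(fields["PITCH"]),
    )

frame = parse_frame("HR:72.5,P:48.2,PITCH:3.1")
```

Validating each frame at the boundary like this keeps sensor glitches from propagating into the positioning logic.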

Software: On the software side, we developed a React 18 frontend with advanced hooks managing 25+ state variables to orchestrate exercise sequences, interactive SVG body diagrams, and gamified UI elements. Sensor data streams into a Node.js/Express backend via WebSocket connections to the Arduino board, where Python-based calibration and fusion algorithms preprocess inputs. AI features leverage a fine-tuned GPT-OSS-120B model hosted on a Jetson Nano edge server for treatment planning and report generation. Doctors access encrypted, GPT-generated progress reports and live biometric dashboards remotely, enabling tele-monitoring and data-driven treatment adjustments.
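As a toy example of the kind of fusion step the Python preprocessing layer performs, here is a complementary filter blending a gyro-integrated pitch estimate (smooth but drifting) with an accelerometer-derived one (noisy but drift-free). The blend weight, time step, and sample values are illustrative assumptions, not our production calibration:

```python
def complementary_filter(prev_pitch: float, gyro_rate: float,
                         accel_pitch: float, dt: float,
                         alpha: float = 0.98) -> float:
    """Fuse gyro and accelerometer pitch estimates (degrees).

    alpha close to 1 trusts the gyro integration in the short term,
    while the small accelerometer weight corrects long-term drift.
    """
    gyro_estimate = prev_pitch + gyro_rate * dt
    return alpha * gyro_estimate + (1 - alpha) * accel_pitch

# Feed a short synthetic stream of (gyro deg/s, accel deg) samples.
pitch = 0.0
samples = [(1.0, 0.1), (1.0, 0.3), (1.0, 0.5)]
for gyro_rate, accel_pitch in samples:
    pitch = complementary_filter(pitch, gyro_rate, accel_pitch, dt=0.01)
```

The same pattern generalizes to roll and yaw, one filter per axis, before the fused pose is streamed to the frontend over WebSocket.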

Challenges we ran into

Hardware Challenges: Building our smart chair from scratch meant designing and 3D-printing custom mounts for the LIDAR, IMUs, and force sensors, then wiring them into an Arduino Mega board. Fine-tuning the precise alignment of the Velodyne VLP-16 LIDAR took multiple iterations, and integrating high-torque servo motors required careful calibration to avoid jitter. Soldering hundreds of sensor connections by hand and troubleshooting intermittent data dropouts in the serial link added weeks to our build schedule.

Software Challenges: Our initial React UI proved too brittle for the real-time demands and was completely rewritten. We rearchitected it with React 18 hooks for cleaner state management and rebuilt every animation using Framer Motion to ensure smooth transitions at 60 fps. Coordinating multiple overlapping timers and responsive SVG interactions revealed edge-case bugs, forcing us to redesign our component hierarchy and optimize rendering to eliminate frame drops.

Accomplishments that we're proud of

Hardware Accomplishments: We successfully prototyped and assembled a fully functional smart chair from scratch, leveraging 3D-printed custom mounts and off-the-shelf sensors. Our team integrated a Velodyne VLP-16 LIDAR with 12-axis servo control to achieve sub-millimeter positioning accuracy, and fused data from IMUs, force sensors, and biometric modules into a cohesive embedded system. Rigorous safety testing validated emergency stop protocols, collision avoidance, and seamless sensor communication over Arduino Mega and Jetson Nano platforms, resulting in a robust, scalable hardware foundation for automated rehabilitation.

Software Accomplishments: We rebuilt the entire React frontend to optimize real-time responsiveness, migrating to React 18 hooks and Framer Motion for 60 fps animations. The UI now supports dynamic SVG body mapping with 12 interactive regions, multi-timer coordination for exercise sequences, and real-time WebSocket integration to stream sensor data with zero frame lag. Our GPT-OSS-120B AI assistant seamlessly generates treatment plans, and we implemented a full gamification layer—points, leaderboards, and achievements—boosting compliance to 95%. Finally, secure remote monitoring delivers encrypted, GPT-generated reports for clinicians, enabling telemedicine at scale.
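The gamification layer's scoring logic can be sketched roughly like this; the point values, accuracy scaling, and streak cap are illustrative assumptions rather than our exact production rules:

```python
def session_points(exercises_completed: int, accuracy: float,
                   streak_days: int) -> int:
    """Score one rehab session for the leaderboard.

    Base points per completed exercise are scaled by form accuracy
    (0.0-1.0), plus a daily-streak bonus capped at one week so that
    consistency is rewarded without runaway scores.
    """
    base = exercises_completed * 10
    bonus = min(streak_days, 7) * 5
    return round(base * accuracy) + bonus

points = session_points(exercises_completed=8, accuracy=0.9, streak_days=3)
```

Capping the streak bonus keeps long-time users and newcomers on a comparable leaderboard scale, which matters for sustaining motivation.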

What we learned

On the hardware side, we discovered the art in precision: how 3D-printed parts and LIDAR pulses must dance in perfect harmony, or the chair stumbles. Late nights spent soldering tiny sensor boards turned frustration into triumph as each successful calibration reminded us why we began: to restore movement and hope.

On the software side, we learned that code isn’t just logic—it’s empathy. Rewriting the UI for silky animations taught us patience; every bug crushed was a promise kept to users counting on reliable guidance. Taming real-time sensor streams and AI prompts into a seamless, life-changing experience proved that technology, at its heart, serves the human spirit.

What's next for ORCA SAATH SAFAR

We plan to open-source our entire codebase and hardware designs under the MIT License, inviting global developers, clinicians, and makers to contribute and adapt ORCA SAATH SAFAR for any community. By making the project fully transparent and extensible, we’ll accelerate innovation, enable localized customizations, and foster a worldwide ecosystem of accessible rehabilitation technology. Open-sourcing will include complete 3D CAD files, Arduino firmware, React UI code, AI fine-tuning scripts, and detailed documentation—empowering everyone to improve lives through collaborative, community-driven healthcare solutions.

Built With

  • arduino-mega-2560
  • c/c++
  • framer-motion
  • gpt-oss-120b
  • javascript-(es6+)
  • lstm
  • python
  • raspberry-pi-4
  • react-18
  • react-router
  • recovery
  • svg
  • typescript-(optional)