Inspiration
One of our team members spent a long time working at a rehab center and saw these challenges firsthand: how difficult it is for patients to stay consistent with home exercises, and how little feedback they receive once they leave the clinic. That experience inspired us to build GaitGuard.
What it does
GaitGuard is a wearable rehab system that gives real-time personalized feedback on movement. A therapist first demonstrates the correct motion, the patient then performs it a few times, and the system builds a personalized digital twin of the patient's correct movement. During future reps, the system compares the patient's motion to that model and gives haptic feedback if the movement is off.
The key part is that this is not limited to walking. It can be adjusted to many different rehab exercises, which makes it useful for a wide range of physical therapy movements a therapist may want a patient to practice. That also means patients can take the device home, continue their exercises outside the clinic, and still receive automatic correction while they train.
How we built it
The system has three wearable sensor units. The main unit straps to the shin and houses an ESP32, an MPU-6050 IMU, a 3.7V 300mAh LiPo battery with charger, and a haptic vibration motor, all inside a custom 3D printed enclosure designed in SolidWorks and printed in PLA on a Bambu X1C. A second MPU-6050 in its own printed enclosure straps to the ankle, connected to the main unit through protected wire sheathing. The third unit is an ESP32 with a 1.69" touch display and built-in IMU that straps to the foot with its own battery. Two of the three units attach with Velcro for quick application and removal.
Each ESP32 samples its IMU at 50 Hz and streams raw accelerometer and gyroscope readings over WiFi to a Raspberry Pi 5 running QNX, which serves as the central processing hub. A complementary filter (α = 0.98) fuses those signals into stable joint angle estimates for the shin, ankle, and foot, correcting for gyroscope drift in real time. The Pi also serves a web page showing a live preview of the sensor streams.
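The fusion step above can be sketched roughly as follows. This is a minimal single-axis (pitch) version with the α = 0.98 and 50 Hz values from our setup; the exact axis conventions and per-joint geometry in our pipeline are more involved:

```python
import numpy as np

ALPHA = 0.98   # complementary filter coefficient
DT = 1 / 50.0  # 50 Hz IMU sample period

def complementary_filter(accel, gyro_rate, angle_prev):
    """Fuse one accelerometer/gyroscope sample into a pitch estimate.

    accel: (ax, ay, az) in g; gyro_rate: pitch rate in deg/s;
    angle_prev: previous fused angle in degrees.
    """
    ax, ay, az = accel
    # Gravity-based angle: noisy, but has no long-term drift.
    accel_angle = np.degrees(np.arctan2(ax, np.sqrt(ay**2 + az**2)))
    # Gyro integration: smooth, but drifts as bias accumulates.
    gyro_angle = angle_prev + gyro_rate * DT
    # Blend: trust the gyro short-term, the accelerometer long-term.
    return ALPHA * gyro_angle + (1 - ALPHA) * accel_angle
```

Because each sample pulls the estimate 2% of the way toward the drift-free accelerometer angle, gyro drift decays instead of accumulating, which is what keeps the joint angles stable over a long session.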
The software pipeline runs in Python on the Pi. During calibration, the therapist performs several clean repetitions of the target movement. Each rep is automatically segmented, passed through a low-pass filter at 6 Hz to remove noise, and time-normalized to 100 data points so reps of different speeds can be compared fairly.
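The per-rep preprocessing described above (6 Hz low-pass, time-normalization to 100 points) looks roughly like this; the filter order and the use of a zero-phase Butterworth filter are our illustrative choices here, not necessarily the exact production settings:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50.0        # IMU sample rate (Hz)
CUTOFF = 6.0     # low-pass cutoff (Hz)
N_POINTS = 100   # time-normalized rep length

def preprocess_rep(rep):
    """Smooth one segmented rep and resample it to a fixed 100-point timeline.

    rep: 1-D array of joint angles for a single repetition.
    """
    # 4th-order zero-phase Butterworth low-pass at 6 Hz removes sensor noise
    # without shifting the waveform in time.
    b, a = butter(4, CUTOFF / (FS / 2), btype="low")
    smoothed = filtfilt(b, a, rep)
    # Linear resampling to 100 points so fast and slow reps align sample-for-sample.
    t_old = np.linspace(0.0, 1.0, len(rep))
    t_new = np.linspace(0.0, 1.0, N_POINTS)
    return np.interp(t_new, t_old, smoothed)
```

After this step, every calibration rep and every live rep lives on the same 100-sample timeline, so point-by-point comparison is meaningful regardless of how fast the patient moved.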
The core of the system is a two-layer stacked LSTM neural network built in PyTorch, trained entirely on the patient's own calibration data rather than a population average. Given the first portion of a new rep as input, the model predicts what the correct remainder of that movement should look like. The deviation between the predicted trajectory and the actual sensor readings is computed at each time step and condensed into a Gait Health Score (GHS) from 0 to 100. When the score drops below a clinical threshold, the system sends a command back to the ESP32 and the haptic motor fires with a vibration pattern corresponding to the specific type of error.
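The shape of that model and scoring step can be sketched as below. The hidden size, the 40-step observed prefix, and the RMSE-to-score scaling are illustrative assumptions; only the two-layer stacked LSTM, the 100-point reps, the three joint angles, and the 0-100 GHS come from our actual design:

```python
import torch
import torch.nn as nn

SEEN = 40     # time steps observed before predicting (assumed split)
TOTAL = 100   # time-normalized rep length
N_JOINTS = 3  # shin, ankle, foot angles

class RepPredictor(nn.Module):
    """Two-layer stacked LSTM: encode the observed prefix of a rep,
    then roll the hidden state forward to predict the remaining steps."""
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(N_JOINTS, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, N_JOINTS)

    def forward(self, prefix):
        # prefix: (batch, SEEN, N_JOINTS)
        out, state = self.lstm(prefix)
        step = self.head(out[:, -1:])          # first predicted step
        preds = [step]
        for _ in range(TOTAL - SEEN - 1):      # autoregressive rollout
            out, state = self.lstm(step, state)
            step = self.head(out)
            preds.append(step)
        return torch.cat(preds, dim=1)         # (batch, TOTAL-SEEN, N_JOINTS)

def gait_health_score(predicted, actual, scale=15.0):
    """Condense per-step deviation into a 0-100 score. `scale` (the RMS
    error in degrees that maps to a score of 0) is a placeholder constant."""
    rmse = torch.sqrt(((predicted - actual) ** 2).mean()).item()
    return max(0.0, 100.0 * (1.0 - rmse / scale))
```

A perfect rep scores 100; the score falls as the patient's trajectory diverges from the model's prediction of their own correct movement, and crossing the threshold is what triggers the haptic command back to the ESP32.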
Challenges we ran into
The biggest challenge was making the LSTM model work accurately with only a small amount of patient calibration data. Getting meaningful predictions from just a few reps required careful normalization and filtering so the model was not learning noise. Other challenges included sensor drift across the three IMUs over a session, and reliably synchronizing data transmission from multiple ESP32s over WiFi without dropped packets or timestamp mismatches that would corrupt the joint angle calculations.
Accomplishments that we're proud of
The accomplishment we are most proud of is the personalized digital twin. By comparing patients against their own therapist-prescribed movement rather than a population average, individual variation stops triggering false positives: a movement that is perfectly normal for one patient is not flagged just because it does not match a generic reference curve. We are also proud of the hardware. The number of components packed into each enclosure is significant, and the press-fit enclosures came out right on the first print. The form factor is practical, comfortable, and something a patient could realistically wear during a home exercise session. Getting the full pipeline running end to end, from raw IMU data all the way to a haptic buzz on the patient's leg, with latency low enough to feel immediate during a rep, was something we were not sure we could pull off in the time we had. Thankfully, we did.
What we learned
The most surprising lesson was that sensor fusion and drift correction matter just as much as the model itself. The machine learning is only as good as the data going into it, and getting clean, stable joint angles from cheap IMUs required more engineering effort than training the LSTM. We also learned that usability drives everything. Simple, distinct haptic patterns were far more effective than asking patients to watch a screen or interpret a score while exercising. The feedback has to be immediate, obvious, and require zero thought from the patient. Finally, we saw firsthand why personalization matters in rehabilitation. Population averages can misclassify normal individual variation as error. What looks like a deviation for one patient is perfectly correct movement for another. Building the system around each patient's own baseline eliminated that problem entirely.
What's next for GaitGuard
The next step is clinical validation through studies with physical therapists and rehab clinics, measuring patient outcomes against standard unsupervised home exercise programs. We want to move model inference fully on-device so the system works without any connected computer, and build a therapist-facing dashboard for monitoring patient progress between visits with session reports, score trends, and alerts when a patient starts struggling. Longer term, expanding the haptic pattern library and supporting a broader set of exercises would open GaitGuard up to a much wider range of outpatient rehabilitation.
Built With
- claude
- python
- qnx
- raspberry-pi
- zed