Inspiration

Accessibility controllers exist, but they all make the same assumption: that you can use your fingers. The Xbox Adaptive Controller just remaps buttons to bigger buttons. Eye trackers replace joysticks but add a whole camera setup. Switch access devices need mounting hardware, calibration rigs, and more time to set up than most people spend actually playing. Every solution adds more stuff while still assuming you have fine motor control in your hands.

For players with limb differences, tremors, or limited hand mobility, the problem has never been which button to press. It's that buttons were never designed for them in the first place.

OmniBlade gets rid of buttons entirely. Strap it on, hold still for 3 seconds, and play. If you have an arm, you can play.

What it does

OmniBlade is a motion-based adaptive game controller that replaces every input with natural arm gestures. Tilt forward to move, tilt sideways to strafe, flick your wrist to attack, dodge, or interact. No buttons. No joysticks. Just motion.

We demoed it live in Dark Souls III, Hades, and Forza Horizon 5 — three completely different control schemes running off the same hardware.

How we built it

The hardware stack is an ESP32 DEVKIT V1 paired with an MPU-6050 6-axis IMU, mounted on a custom 3D printed forearm brace. The ESP32 reads accelerometer and gyroscope data over I2C at 50Hz and runs a complementary filter to fuse both signals into stable roll and pitch angles. Flick gestures are detected via gyroscope spike thresholds with cooldown logic to prevent false triggering.
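
To make the fusion step concrete, here is a minimal Python sketch of the complementary filter logic described above. The real version runs in the ESP32 firmware in C++; the `ALPHA` value, axis conventions, and function names here are illustrative assumptions, not the actual firmware code.

```python
import math

ALPHA = 0.98   # filter coefficient: hypothetical value, tuned by feel
DT = 1 / 50    # 50 Hz sample rate, matching the I2C polling loop


def complementary_filter(roll, pitch, accel, gyro):
    """Fuse one IMU sample into updated roll and pitch angles (degrees).

    accel = (ax, ay, az) in g; gyro = (gx, gy, gz) in deg/s.
    """
    ax, ay, az = accel
    gx, gy, gz = gyro
    # The accelerometer gives an absolute but noisy gravity-based angle
    accel_roll = math.degrees(math.atan2(ay, az))
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    # Gyro integration is smooth but drifts, so blend: trust the gyro
    # for fast motion, let the accelerometer correct slowly over time
    roll = ALPHA * (roll + gx * DT) + (1 - ALPHA) * accel_roll
    pitch = ALPHA * (pitch + gy * DT) + (1 - ALPHA) * accel_pitch
    return roll, pitch
```

With `ALPHA = 0.98`, roughly 98% of each update comes from integrating the gyro and 2% from the accelerometer's gravity estimate, which is what keeps the output stable during flicks while still converging to the true angle at rest.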

Data streams wirelessly over Bluetooth Serial to a Python script on the host PC, which maps tilt angles to WASD keypresses and flick events to virtual Xbox gamepad buttons and mouse clicks, giving us keyboard, mouse, and controller output simultaneously.
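
The tilt-to-WASD half of that mapping can be sketched like this, using `pynput` (which the project lists) for key injection. The deadzone value, key assignments, and helper names are assumptions for illustration; the actual script's thresholds and structure may differ.

```python
TILT_DEADZONE = 15.0  # degrees; hypothetical threshold below which no key fires


def tilt_to_keys(roll: float, pitch: float) -> set:
    """Map tilt angles (degrees) to the set of WASD keys that should be held."""
    keys = set()
    if pitch > TILT_DEADZONE:
        keys.add('w')        # tilt forward -> move forward
    elif pitch < -TILT_DEADZONE:
        keys.add('s')        # tilt back -> move backward
    if roll > TILT_DEADZONE:
        keys.add('d')        # tilt right -> strafe right
    elif roll < -TILT_DEADZONE:
        keys.add('a')        # tilt left -> strafe left
    return keys


try:
    from pynput.keyboard import Controller
    keyboard = Controller()
except Exception:
    keyboard = None  # pynput needs a display; the mapping above still works

held = set()


def update_keys(roll: float, pitch: float) -> None:
    """Press newly wanted keys and release keys that are no longer wanted."""
    global held
    wanted = tilt_to_keys(roll, pitch)
    if keyboard is not None:
        for key in wanted - held:
            keyboard.press(key)
        for key in held - wanted:
            keyboard.release(key)
    held = wanted
```

Tracking the currently held set matters: games expect keys to stay pressed while you hold a tilt, so the script releases a key only when the arm returns inside the deadzone.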

Hardware Components

  • ESP32 DEVKIT V1
  • MPU-6050 6-axis accelerometer and gyroscope
  • 3D printed forearm mount and enclosure
  • Portable USB power bank
  • Jumper wires

Challenges

Getting stable orientation from raw IMU data was the core problem. The accelerometer alone is too noisy. The gyroscope alone drifts. The complementary filter balances both, trusting the gyro for fast motion and the accelerometer for slow corrections. Tuning the alpha value and flick thresholds to feel natural without false triggering took significant iteration.
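
The flick-detection side of that tuning can be sketched as a spike threshold plus cooldown. The threshold and cooldown values below are hypothetical stand-ins for the tuned numbers, and the direction convention is an assumption:

```python
import time

FLICK_THRESHOLD = 250.0  # deg/s; hypothetical gyro spike threshold
COOLDOWN = 0.4           # seconds between flicks, to prevent double-fires


class FlickDetector:
    def __init__(self, threshold=FLICK_THRESHOLD, cooldown=COOLDOWN):
        self.threshold = threshold
        self.cooldown = cooldown
        self.last_fire = -1e9  # far in the past so the first flick can fire

    def update(self, gx, gy, now=None):
        """Return a flick direction ('up'/'down'/'left'/'right') or None."""
        now = time.monotonic() if now is None else now
        if now - self.last_fire < self.cooldown:
            return None  # still in cooldown: ignore residual spikes
        if abs(gx) < self.threshold and abs(gy) < self.threshold:
            return None  # no spike on either axis
        self.last_fire = now
        # Classify by the dominant axis of the spike
        if abs(gx) >= abs(gy):
            return 'right' if gx > 0 else 'left'
        return 'up' if gy > 0 else 'down'
```

The cooldown is what absorbs the "ringing" after a flick: a sharp wrist motion produces a counter-spike as the arm decelerates, which would otherwise register as a second gesture.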

Mapping gestures to a game as input-dense as Dark Souls III with only 4 flick directions forced hard design decisions about which inputs actually matter.
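
To make the constraint concrete, a reduction of that input space to four flicks might look like the mapping below. These specific assignments are illustrative, not the actual bindings the project shipped:

```python
# Hypothetical mapping: each of the four flick directions is reserved for
# one high-value Dark Souls III action; every other input had to be cut
# or folded into a tilt gesture.
FLICK_ACTIONS = {
    'up': 'light_attack',
    'down': 'dodge_roll',
    'left': 'interact',
    'right': 'use_item',
}
```

With only four slots, anything not on this list (heavy attacks, two-handing, gestures, menus) either loses its binding or has to share one, which is the design decision the paragraph above describes.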

What we learned

  • I2C communication and sensor fusion on embedded hardware
  • Complementary filtering for IMU orientation estimation
  • Bluetooth Serial on ESP32 for wireless data streaming
  • Virtual gamepad, keyboard, and mouse simulation on Windows via Python
  • How to design gesture vocabularies around physical constraints

What's next

The current build only scratches the surface. We want to swap the complementary filter for a Madgwick filter for better orientation accuracy under fast motion, add a second IMU on the upper arm so we can detect relative motion between segments, and train a gesture classifier that goes way beyond 4 flick directions. The end goal is a controller expressive enough to handle any game, for any player.

Built With

  • elevenlabs
  • esp-32
  • imu
  • pynput
  • python