FightStance: Real-Time Fighting Game Powered by Your Body

## 🚨 Inspiration

Obesity rates among children and adolescents are alarmingly high. In fact, the World Health Organization reports that over 39 million children under the age of 5 were overweight or obese in 2020. It’s no surprise — with the rise of “iPad kids,” screens are replacing sports, and sedentary lifestyles are becoming the norm.

But what if we could flip the script? What if video games — the very cause of inactivity — could become the solution?

## 🎮 What It Does

FightStance is an interactive fighting game that replaces traditional controllers with your own body. It tracks your movements — punches, kicks, jumps, and walking — using computer vision. Instead of tapping buttons, you physically perform the actions to control your character, letting you get a full workout while playing a street-fighter-style game!

It’s fitness disguised as fun.

All it takes is a camera, a Raspberry Pi, and a laptop, and you’re ready to fight your way to fitness, whether you're at home or on the go.

## 🛠️ How We Built It

Let’s get technical.

We used:

- **QNX** – A real-time operating system with deterministic performance, perfect for embedded applications and performance-heavy systems.
- **OpenCV** – A powerful, open-source computer vision library for processing camera input.
- **MediaPipe** – Google’s framework for real-time pose estimation and landmark detection using machine learning.

Why QNX? We wanted to optimize for portability and real-time response. QNX is often used in automotive and mission-critical systems — so why not for a fighting game?

Why Python OpenCV and MediaPipe? MediaPipe’s cutting-edge pose estimation works seamlessly with Python’s OpenCV bindings. Together, they allowed us to extract key body landmarks with high accuracy in real time.

Using Unity, we created the 2D fighting game. We added artwork and animations from the Unity Asset Store, and coded the combat logic and mechanics from the ground up! We configured motion detection by mapping body landmarks — for example:

If the wrist rises quickly above the right hip → it's a punch. If one leg forms an angle of more than 45 degrees with the other → it's a kick. We also determine the direction (left or right) based on wrist position relative to the body center!
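As a rough illustration, the punch rule above can be sketched as pure functions over normalized MediaPipe-style coordinates (where y grows downward). The threshold below is a hypothetical placeholder, not our calibrated value:

```python
# Sketch of the punch heuristic: the wrist must rise quickly and end up
# above the hip. Coordinates are normalized, with y increasing downward
# (MediaPipe convention). RISE_SPEED is an illustrative threshold.

RISE_SPEED = 0.05  # minimum upward wrist travel per frame (hypothetical)

def is_punch(wrist_y_prev, wrist_y_now, hip_y):
    """True when the wrist rises quickly above the hip."""
    rising_fast = (wrist_y_prev - wrist_y_now) > RISE_SPEED  # smaller y = higher
    above_hip = wrist_y_now < hip_y
    return rising_fast and above_hip

def punch_direction(wrist_x, body_center_x):
    """Left/right is judged from wrist position relative to body center."""
    return "right" if wrist_x > body_center_x else "left"
```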

We establish a baseline “standing” distance between ankles. When that distance suddenly increases, it’s likely a kick. We determine kick direction based on which hip is closer to the camera (using depth from the z-coordinate).
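Under the same assumptions (normalized coordinates, MediaPipe's convention that z shrinks as a point nears the camera), the ankle-distance check might look like this; the ratio is a made-up placeholder rather than our tuned value:

```python
# Sketch of the kick heuristic: ankle separation jumping well past the
# calibrated standing baseline signals a kick. KICK_RATIO is illustrative.

KICK_RATIO = 1.8  # spread must exceed 1.8x the standing baseline (hypothetical)

def is_kick(ankle_distance, standing_baseline):
    """True when ankle separation suddenly exceeds the standing baseline."""
    return ankle_distance > KICK_RATIO * standing_baseline

def kick_direction(left_hip_z, right_hip_z):
    """MediaPipe z decreases toward the camera, so the nearer hip leads."""
    return "left" if left_hip_z < right_hip_z else "right"
```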

Walking is recognized by detecting alternating movement in the knees and ankles. If knees and ankles show rhythmic up-and-down motion, and it continues across multiple frames, we infer a walking action. Using the z-position of the hips, we calculate walking direction (left or right).
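One way to code the alternating-motion test is to require both knees to oscillate over a window of frames, roughly out of phase with each other. The window length, amplitude floor, and phase test below are all illustrative assumptions:

```python
# Sketch of the walking heuristic: both knees must oscillate vertically
# over several frames, roughly out of phase with each other.
# WINDOW and MIN_AMPLITUDE are illustrative, uncalibrated values.

WINDOW = 10            # frames of history required
MIN_AMPLITUDE = 0.03   # minimum vertical knee travel, normalized units

def is_walking(left_knee_ys, right_knee_ys):
    """True when both knees show sustained, alternating up-down motion."""
    if len(left_knee_ys) < WINDOW or len(right_knee_ys) < WINDOW:
        return False
    # Each knee must actually be moving up and down.
    if max(left_knee_ys) - min(left_knee_ys) < MIN_AMPLITUDE:
        return False
    if max(right_knee_ys) - min(right_knee_ys) < MIN_AMPLITUDE:
        return False
    # Alternation: when one knee sits above its mean height, the other
    # should usually sit below its own mean (negative product of deviations).
    l_mean = sum(left_knee_ys) / len(left_knee_ys)
    r_mean = sum(right_knee_ys) / len(right_knee_ys)
    opposed = sum(
        (l - l_mean) * (r - r_mean) < 0
        for l, r in zip(left_knee_ys, right_knee_ys)
    )
    return opposed > len(left_knee_ys) // 2
```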

Jumping is captured by monitoring the vertical velocity of both feet. We track the Y-coordinates of the left and right ankles. If both feet move upward quickly at the same time, and velocity exceeds a threshold → it's a jump.

We calculate an average foot velocity across a sliding window of frames to avoid false positives. All of this was coded and calibrated manually by analyzing the X and Y coordinates of the user’s joints — giving the game precise, gesture-based control.
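Putting the last two paragraphs together, a minimal jump detector with a sliding window might look like the following; `WINDOW` and `JUMP_VELOCITY` are assumed placeholder values, not the thresholds we calibrated:

```python
from collections import deque

# Sketch of the jump heuristic: average upward ankle velocity across a
# sliding window of frames must exceed a threshold for BOTH feet at once.
# WINDOW and JUMP_VELOCITY are illustrative, uncalibrated values.

WINDOW = 5             # frames in the sliding window
JUMP_VELOCITY = 0.04   # normalized y units per frame, upward

class JumpDetector:
    def __init__(self):
        self.left = deque(maxlen=WINDOW)
        self.right = deque(maxlen=WINDOW)

    def update(self, left_ankle_y, right_ankle_y):
        """Feed one frame of ankle y-coordinates; True when a jump fires."""
        self.left.append(left_ankle_y)
        self.right.append(right_ankle_y)
        if len(self.left) < WINDOW:
            return False
        # y shrinks as a point moves up, so upward velocity = (old - new) / frames.
        # Averaging across the window smooths jitter and cuts false positives.
        v_left = (self.left[0] - self.left[-1]) / (WINDOW - 1)
        v_right = (self.right[0] - self.right[-1]) / (WINDOW - 1)
        return v_left > JUMP_VELOCITY and v_right > JUMP_VELOCITY
```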

But it wasn’t that easy...

## Challenges Faced

The challenges we faced can be summed up in three syllables: QNX.

Unlike Linux or Windows, QNX doesn't provide straightforward support for Python or OpenCV out of the box. The MediaPipe library depends on Python-based OpenCV, while QNX natively supports only C++ OpenCV.

We initially tried to run Python OpenCV on our Raspberry Pi, but QNX proved to be quite the nuisance. After spending over 20 hours trying to get it to work, we settled on a compromise: run OpenCV alongside Unity on the laptop, and use the Raspberry Pi, running QNX, as an external webcam.

## 🏆 Accomplishments That We're Proud Of

- Designing a polished-looking Unity game in just 36 hours (even if the code was messy)!
- Successfully integrating real-time body motion tracking into an interactive Unity game.
- Surviving 36 stressful hours on nothing but caffeine and large language models!
- Designing a fun, fitness-friendly interface for both kids and adults.
- Creating a portable, lightweight system that runs on a laptop and a Raspberry Pi — no bulky gear required!

## 📚 What We Learned

- How to manually compile and link libraries in constrained environments.
- Deep integration between Python, OpenCV, and MediaPipe for real-time motion analysis.
- The importance of balancing performance and compatibility in system design.
- How to design intuitive, engaging game logic based on human biomechanics.

## 🚀 What’s Next for FightStance

We plan to:

- Include calorie-burn estimation and fitness tracking for a health-focused twist.
- Launch a mobile version that connects with just a phone camera — no external gear needed.
- Build a library of mini-games (boxing, sports, adventure) to keep things fresh.

FightStance isn't just a game. It's a movement. One that encourages kids to move, laugh, sweat — and fight back against sedentary living.

Let’s get up. Let’s throw some punches. Let’s make playtime active again.
