Inspiration
We wanted to nudge a teammate’s New Year’s fitness resolution along, so we turned it into a playful “run around NUS” experience.
Classic runner vibes (think Subway Surfers) but with campus-specific jokes, MC/SU collection, and a surprise “Professor” boss that requires the 6-7 hand gesture.
What it does
Endless runner through NUS: dodge tables, crowds, peacocks, and the NUS bus while collecting MCs and SUs toward a 160 MC graduation goal.
Boss battle swaps keyboard mashing for computer-vision hand gestures: alternate left/right hands to land 67 hits and defeat the professor.
Dynamic background playback (upload a “lecture” video), perspective calibration, and retro UI overlays tuned to the campus theme.
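The alternating-hand boss mechanic boils down to a tiny state machine. A minimal Python sketch (class and method names are illustrative; in the game this runs in JS against MediaPipe Hands labels):

```python
class BossFight:
    """Sketch of the alternating-hand hit counter.

    A hit only lands when the detected hand ("Left"/"Right") differs from
    the previous one, so spamming a single hand can't defeat the professor.
    """

    HITS_TO_WIN = 67  # the "6-7" gag: 67 alternating hits

    def __init__(self):
        self.hits = 0
        self.last_hand = None

    def register_gesture(self, hand_label: str) -> bool:
        """Record a swing; return True once the boss is defeated."""
        if hand_label != self.last_hand:
            self.hits += 1
            self.last_hand = hand_label
        return self.hits >= self.HITS_TO_WIN


fight = BossFight()
for i in range(100):  # alternate perfectly: defeated on the 67th swing
    if fight.register_gesture("Left" if i % 2 == 0 else "Right"):
        break
print(fight.hits)  # 67
```

Tracking only the previous hand keeps the rule simple while still forcing genuine alternation.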
How we built it
Frontend: React + Canvas (custom renderer) with Vite tooling; retro pixel art and lane/perspective projection logic.
CV/gesture: MediaPipe Hands in the browser (JS) for the boss fight; Python/MediaPipe Tasks + OpenCV harness in hand detection_v2/ for offline gesture testing and tuning.
Game systems: lane movement, collision, obstacle spawner, boss timer/state machine, and leaderboard persistence via localStorage.
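The lane/perspective projection mentioned above can be sketched as a linear interpolation toward a vanishing point. This is a Python illustration only; the constants (vanishing point, lane width, screen height) are placeholders, and the real renderer lives in the React + Canvas code:

```python
def project_lane(lane: int, depth: float,
                 vanish=(400, 150), bottom_y=600, lane_width=180):
    """Map (lane, depth) to screen coordinates.

    lane:  -1, 0, 1 for left/middle/right
    depth: 0.0 at the horizon (vanishing point), 1.0 at the bottom edge
    Lanes converge toward the vanishing point, so obstacles drift inward
    and shrink as they appear in the distance.
    """
    x = vanish[0] + lane * lane_width * depth
    y = vanish[1] + (bottom_y - vanish[1]) * depth
    scale = depth  # sprites scale linearly with depth in this sketch
    return x, y, scale

print(project_lane(1, 1.0))   # (580, 600.0, 1.0) — right lane, at the player
print(project_lane(-1, 0.0))  # (400, 150.0, 0.0) — everything meets at the horizon
```

Because an uploaded “lecture” video can have any framing, the vanishing point and lane width become calibration parameters rather than constants.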
Challenges
Getting MediaPipe hand tracking to behave consistently across the browser and Python harnesses (GPU/GL issues on macOS forced a CPU fallback).
Aligning lane perspective to arbitrary uploaded videos while keeping gameplay readable.
Keeping gesture detection intuitive in mirrored (selfie) mode and mapping hands consistently.
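One way to keep the mirrored-mode mapping consistent is to centralize the left/right flip in a single helper. Whether the flip is actually needed depends on whether frames are mirrored before inference, so this is a hedged sketch rather than the project’s code:

```python
def apparent_hand(handedness_label: str, mirrored: bool = True) -> str:
    """Map a detector's handedness label to the hand the player sees.

    In a mirrored (selfie) preview, the on-screen hand is the opposite of
    the label reported for the unmirrored frame. Centralizing the flip here
    means the rest of the game logic never has to reason about mirroring.
    """
    if not mirrored:
        return handedness_label
    return "Right" if handedness_label == "Left" else "Left"

print(apparent_hand("Left"))                  # "Right" in selfie mode
print(apparent_hand("Left", mirrored=False))  # "Left" unmirrored
```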
Accomplishments
A working gesture-driven boss fight that replaces the old key-spam mechanic.
Campus-flavored obstacles, tokens, and UI polish that sells the NUS theme.
Tooling to calibrate perspective and separately test gestures (JS in-app + Python harness).
What we learned
MediaPipe Hands/Tasks quirks across platforms and how to debug GPU vs. CPU delegation.