Inspiration

Inspired by the popular Korean TV series 'Squid Game', this was our attempt to combine all our shared interests in mobile development, UI design, distributed systems, computer vision, and textiles to rebuild the doll in 'Red Light Green Light'.

What it does

Users can play the game through our mobile app, where they register themselves with a player number and headshot. The game then starts with a countdown, and players are directed to line up 30m away from the doll. Players may move while the doll says 'Green Light', but must stop and freeze immediately when it says 'Red Light' to avoid elimination.

How we built it

Our project uses an event-driven WebSocket architecture to create a game that runs across three networked devices: a mobile app, a Raspberry Pi embedded inside the doll, and a laptop for computer vision. The Raspberry Pi acts as the central control hub, managing all inter-device WebSocket communication through predefined APIs, coordinating the sequencing of game events, and controlling the doll's physical components: a Bluetooth speaker, servo motors, and a USB camera.
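To illustrate the hub's role, here is a minimal sketch of how the Pi might decide where each incoming event gets forwarded. The device names and event types (`REGISTER`, `FRAME`, `ELIMINATED`) are hypothetical placeholders for this writeup, not our actual API:

```python
# Hypothetical device names; the real endpoints and protocol differ.
APP, DOLL, CV = "app", "doll", "cv"

def route(source: str, event: dict) -> list[tuple[str, dict]]:
    """Decide which devices a received WebSocket event is forwarded to."""
    kind = event.get("type")
    if source == APP and kind == "REGISTER":
        # Player registration (number + headshot) goes to the CV laptop.
        return [(CV, event)]
    if source == DOLL and kind == "FRAME":
        # Camera frames captured during 'Red Light' go to the CV laptop.
        return [(CV, event)]
    if source == CV and kind == "ELIMINATED":
        # Elimination results fan out to both the app and the doll.
        return [(APP, event), (DOLL, event)]
    return []
```

In the real system this routing logic sits inside an async WebSocket handler on the Pi, with one persistent connection per device.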

On the mobile side, we developed a cross-platform app for Android and iOS using Kotlin Multiplatform (KMP), and used Ktor to establish a WebSocket connection with the Pi. The mobile app is responsible for player registration (including taking headshots), starting the game, and updating each player's elimination status in the UI in real time.
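For a sense of what travels over those WebSocket connections, the messages could be serialized roughly like this (field names here are illustrative guesses, not the actual schema; the headshot is base64-encoded so the image survives a JSON text frame):

```python
import base64
import json

def register_message(player_number: int, headshot_jpeg: bytes) -> str:
    """Serialize a registration event (player number + headshot) for the Pi."""
    return json.dumps({
        "type": "REGISTER",  # hypothetical event name
        "player": player_number,
        # Binary image bytes are base64-encoded to fit in a JSON text frame.
        "headshot": base64.b64encode(headshot_jpeg).decode("ascii"),
    })

def parse_elimination(raw: str) -> list[int]:
    """Parse an elimination broadcast from the Pi into player numbers."""
    msg = json.loads(raw)
    if msg["type"] != "ELIMINATED":
        raise ValueError(f"unexpected message type: {msg['type']}")
    return msg["players"]
```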

For player elimination, we offloaded the computationally heavy work to a computer vision module running on a nearby laptop, built with Python and OpenCV. This module functions as a lightweight backend service, receiving video frames from the Pi and returning a list of eliminated players. The final CV algorithm has two stages: 1) motion detection and 2) player-to-motion mapping. For motion detection, we used classical CV techniques such as background subtraction and frame differencing (comparing changes in pixels across consecutive frames), and this worked really well : ) For player-to-motion mapping, we would ideally have used facial recognition or nearest-neighbor analysis to match each detected movement to a registered player. However, having run out of time on very minimal sleep, we opted for a hardcoded mapping based on the players' relative lineup order instead - which WORKED (for the most part)
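The two stages can be sketched as follows, using plain NumPy as a stand-in for the OpenCV calls (the real module uses `cv2`, e.g. `cv2.absdiff` and background subtractors). Splitting the frame into equal vertical strips, one per player in lineup order, is the hardcoded player-to-motion mapping described above; the threshold constants are illustrative, not our tuned values:

```python
import numpy as np

MOTION_THRESHOLD = 25      # minimum per-pixel intensity change to count as motion
ELIMINATION_RATIO = 0.02   # fraction of a strip's pixels that must move to eliminate

def eliminated_players(prev: np.ndarray, curr: np.ndarray,
                       lineup: list[int]) -> list[int]:
    """Frame differencing, mapped to players by hardcoded lineup order.

    `prev` and `curr` are grayscale frames (H x W, uint8). The frame is split
    into equal vertical strips, one per player in `lineup`, left to right.
    """
    # Stage 1: motion detection via frame differencing.
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    moving = diff > MOTION_THRESHOLD                 # boolean motion mask
    # Stage 2: player-to-motion mapping by relative lineup position.
    strips = np.array_split(moving, len(lineup), axis=1)
    return [player for player, strip in zip(lineup, strips)
            if strip.mean() > ELIMINATION_RATIO]
```

During 'Red Light' phases, the laptop runs this over consecutive frames from the Pi's camera and sends back the eliminated player numbers.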

Challenges we ran into

  • WebSocket connection on iOS being flaky
  • Synchronization of time-sensitive events on the Raspberry Pi
  • Handling player identification using classical computer vision
  • Deluding ourselves about the mechanics of our doll head - it turns out it is very hard to turn a 5lb doll head with a very skinny servo rod/doll neck, since the moment arm is practically non-existent
  • Being so tired you mistake eczema steroid cream for toothpaste in the morning
  • Losing $10 to the vending machine
  • The common cold
  • Spending hours at night alone with the doll head

Accomplishments that we're proud of

  • Having a fully integrated multiplayer game with a mobile app, embedded hardware, and computer vision pieces all working within 24 hours : D
  • "Our daughter looks beautiful" - Amanda
  • Not getting hypothermia from sleeping on the cold concrete
  • Sewing an entire doll's outfit together with only cardboard and scrap fabric

What we learned

  • Working with WebSockets and async programming in Python/Kotlin
  • Designing clothing for a doll with recycled textiles
  • 3D printing a part in the shortest time possible

What's next for Red Squid Dead Squid

  • Get the player identification and player-to-motion mapping code working
  • Add toy gun mechanism to shoot eliminated players
  • Add battery system to make the doll portable
