Inspiration

Our inspiration for this project is Elon Musk's Neuralink, a brain implant that reads neural signals to perform actions in the real world. However, it is expensive, hard to implement, and overall a very risky procedure. Our goal is to help gamers with disabilities affecting hand-eye coordination by removing the "hand" from the equation.

What it does

Unbound uses an OpenCV-based head-tracking algorithm to control movement in games such as racing simulators: tilting your head in a direction steers the car that way. In addition, a buzzer provides synesthesia-like feedback, translating visual information into sound to help players process it and stay on the race course.
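The head-to-steering mapping can be sketched as a small decision function. This is a minimal illustration, not the project's actual code: the function name, the `DEAD_ZONE` threshold, and the string commands are our own assumptions about how the OpenCV face position might be turned into steering input.

```python
# Hypothetical sketch: map the horizontal position of the detected
# face to a steering command. In the real pipeline, OpenCV's face
# detector would supply face_center_x each frame, and the command
# would be translated into key presses for the game.

DEAD_ZONE = 0.15  # assumed fraction of frame width treated as "straight"

def decide_steering(face_center_x, frame_width):
    """Return "left", "right", or "straight" from the face's x position.

    face_center_x: x-coordinate (pixels) of the face bounding-box centre.
    frame_width:   width of the camera frame in pixels.
    """
    # Normalised offset in [-0.5, 0.5]; negative means head left of centre.
    offset = face_center_x / frame_width - 0.5
    if offset < -DEAD_ZONE:
        return "left"
    if offset > DEAD_ZONE:
        return "right"
    return "straight"
```

A dead zone around the centre keeps small, involuntary head movements from jittering the car left and right.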

How we built it

  • OpenCV for head detection
  • SuperTuxKart (open-source game), where we ran our control scripts
  • Used an Arduino Nano and a buzzer to provide synesthetic auditory feedback
  • 3D-printed a case for the circuit
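To connect the two halves of the build, the host script has to tell the Arduino what tone to play. Below is a hedged sketch of one way to do that; the one-byte protocol, the tone range, and the function name are our assumptions, not the project's actual firmware contract.

```python
# Hypothetical sketch: encode the head offset as a single byte for the
# Arduino. The firmware side would map 0..255 back onto a tone()
# frequency between MIN_TONE_HZ and MAX_TONE_HZ (assumed values).

MIN_TONE_HZ = 200
MAX_TONE_HZ = 2000

def buzzer_byte(offset):
    """Encode a head offset in [-0.5, 0.5] as one byte in 0..255.

    0 means head far left, 255 far right; values outside the valid
    range are clamped so a noisy detection can't overflow the byte.
    """
    offset = max(-0.5, min(0.5, offset))
    return round((offset + 0.5) * 255)
```

On the host, the byte could then be written with pySerial, e.g. `serial.Serial("/dev/ttyUSB0", 9600).write(bytes([buzzer_byte(offset)]))` (port and baud rate are placeholders).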

Challenges we ran into

  • Integrating with open-source games was challenging: the game would not recognize gestures detected by the camera
  • The Arduino IDE threw errors when communicating over the Serial terminal
  • Time: the 3D printer needs time to produce parts, so prioritizing speed compromised print quality
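One common cause of the serial-terminal errors above is reading partial messages mid-transmission. A sketch of newline-framed parsing that sidesteps this is shown below; the framing scheme and function name are our own illustration, not the project's code.

```python
# Hypothetical sketch: accumulate raw serial bytes and only hand back
# complete, newline-terminated messages, so a read that lands in the
# middle of a message never produces garbled output.

def feed(buffer, chunk):
    """Append chunk to buffer and split out complete lines.

    Returns (remaining_buffer, list_of_complete_messages). A partial
    line stays in the buffer until its terminating newline arrives.
    """
    buffer += chunk
    *messages, buffer = buffer.split(b"\n")
    return buffer, [m.decode("ascii", errors="replace") for m in messages]
```

The caller would pass each `serial.read()` result through `feed` and keep the leftover buffer between calls.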

Accomplishments that we're proud of

  • Integrating OpenCV to control the game (and the computer itself) added real technical depth to the project
  • Combining hardware and software
  • Making gaming accessible to aspiring gamers with disabilities

What we learned

  • Managing time effectively is important. With several group members away taking the SAT, we learned we should have used the early window of full-team availability to set up a strong base.

What's next for Unbound

Because we built this project around open-source games, we have a strong platform for scaling Unbound to many other titles. This wasn't built just for an 8-hour hackathon; it was built to evolve beyond today and for years into the future.
