Inspiration

What inspired us was our love of video games. We both love games and feel they've become an important part of our lives. We want to make developers' lives easier by providing them with more engaging tools that lessen the already massive workload on their plate, so they can focus less on technical know-how and more on getting their idea off the ground.

What it does

Our client does two things. It makes it easy to manage your library of the games you've built, letting you plug them in and play, and it provides boilerplate code you can copy and paste into your games to add motion controls quickly and easily.

How we built it

We built the project with OpenCV as its main driver. With OpenCV and MediaPipe, we can offload the heavy-duty logic of tracking and landmarking the hands. We then take those landmarks and run geometric checks on them, which let the software work out which gesture a hand is making.
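To make the idea of a geometric check concrete, here is a minimal sketch in plain Python. It assumes MediaPipe Hands' 21-point landmark layout (landmark 8 is the index fingertip, landmark 6 the index PIP joint, with normalized image coordinates where y grows downward); the function names and gesture labels are illustrative, not our exact code.

```python
# Geometric check on MediaPipe-style hand landmarks: a list of 21
# (x, y) pairs in normalized image coordinates, where y grows downward.
# Landmark 8 is the index fingertip; landmark 6 is the index PIP joint.

def is_index_extended(landmarks):
    """True if the index finger points up (tip higher than its PIP joint)."""
    _, tip_y = landmarks[8]
    _, pip_y = landmarks[6]
    return tip_y < pip_y  # smaller y = higher in the image

def detect_gesture(landmarks):
    """Map a landmark set to a named gesture (labels are illustrative)."""
    if is_index_extended(landmarks):
        return "point"
    return "fist"

# Fake landmark data standing in for real MediaPipe output:
pointing = [(0.5, 0.5)] * 21
pointing[8] = (0.5, 0.2)   # fingertip above its joint
pointing[6] = (0.5, 0.4)

curled = [(0.5, 0.5)] * 21
curled[8] = (0.5, 0.6)     # fingertip below its joint
curled[6] = (0.5, 0.4)
```

In the real pipeline these coordinates would come from MediaPipe's detection results each frame; the check itself stays this simple.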

Challenges we ran into

Our biggest challenge was learning how to integrate MediaPipe into our games quickly and smoothly. While we were not able to abstract the logic away completely, the module itself is independent of the game. Working through MediaPipe and learning its landmark system proved difficult, but we came out of it with a ton of knowledge.

Accomplishments that we're proud of

We're most proud of having a finished, working demo, which is rarer than it sounds. Motion controls are integrated into our favorite game, and while not perfect, they work, and they feel good.

What we learned

We learned that hand-tracking technology has become so advanced that it should no longer be limited to the highly technical. On the technical side, we learned how to use OpenCV and MediaPipe, and how to properly build and maintain databases using SQLite.
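As a taste of the SQLite side, here is a sketch of the kind of gesture table we mean, using Python's built-in sqlite3 module. The schema, table name, and gesture/action pairs are illustrative, not our actual database.

```python
import sqlite3

# Illustrative gesture table: each named gesture maps to an in-game action.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE gestures (
        id     INTEGER PRIMARY KEY,
        name   TEXT NOT NULL UNIQUE,
        action TEXT NOT NULL  -- the in-game input this gesture triggers
    )
""")
conn.executemany(
    "INSERT INTO gestures (name, action) VALUES (?, ?)",
    [("fist", "jump"), ("open_palm", "pause"), ("point", "select")],
)
conn.commit()

# Look up the action bound to a recognized gesture.
row = conn.execute(
    "SELECT action FROM gestures WHERE name = ?", ("fist",)
).fetchone()
action = row[0]
```

A file-backed database would replace `":memory:"` with a path, and the game would query it at load time to build its gesture bindings.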

What's next for open_motion

Our next step is abstracting the motion-control logic away from the games. If it lived in a separate module that could be imported with a function call or two, it would greatly reduce the strain of copying and pasting our boilerplate code.
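The interface we have in mind might look something like the sketch below. The class and method names are hypothetical, not an existing API; the point is that a game binds gestures to callbacks in a couple of lines instead of pasting tracking code.

```python
# Hypothetical importable interface for the motion-control module.

class MotionControls:
    """Maps recognized gestures to game-supplied callbacks."""

    def __init__(self):
        self._bindings = {}

    def bind(self, gesture, callback):
        """Register a callback to run when `gesture` is recognized."""
        self._bindings[gesture] = callback

    def dispatch(self, gesture):
        """Called by the tracking loop each time a gesture is detected."""
        handler = self._bindings.get(gesture)
        if handler:
            handler()

# A game would import the module and bind gestures in a line or two:
events = []
controls = MotionControls()
controls.bind("fist", lambda: events.append("jump"))
controls.dispatch("fist")  # normally fired by the camera/tracking loop
```

The tracking loop (OpenCV capture plus MediaPipe landmarks plus geometric checks) would live entirely inside the module and call `dispatch` itself.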

The following step would be integrating our DB module into the project: a master list of gestures that could be iterated on and shared across the community, allowing for a shared network of hand gestures and motion controls.

Lastly, we want to bring machine learning into the system. Gesture detection is unstable at times, so a module that incorporates a classifier to determine which gesture a hand is making would make the experience smoother and higher quality.
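One simple direction, sketched below, is a nearest-neighbor classifier over flattened landmark coordinates: instead of hand-written geometric rules, each new frame is labeled by its closest example in a set of recorded gestures. The training data here is fabricated for illustration, and a real version would use full 21-landmark vectors and a proper ML library.

```python
import math

# Nearest-neighbor gesture classification over landmark feature vectors.

def distance(a, b):
    """Euclidean distance between two flattened landmark vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(sample, training_set):
    """Return the label of the closest training example."""
    return min(training_set, key=lambda ex: distance(sample, ex[0]))[1]

# Each example: (flattened landmark features, gesture label). These
# four-value vectors are toy stand-ins for real 42-value landmark data.
training = [
    ([0.1, 0.9, 0.1, 0.8], "fist"),
    ([0.1, 0.2, 0.1, 0.3], "open_palm"),
]

label = classify([0.1, 0.85, 0.1, 0.75], training)
```

Because the classifier averages over recorded examples rather than relying on a single threshold, it should tolerate the landmark jitter that makes our rule-based checks unstable.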
