Inspiration

Dual goal: inclusivity and entertainment

Inclusivity

About 15% of the world's population lives with some form of disability, according to the WHO's 2011 World Report on Disability. Our team is committed to developing inclusive, AI-based solutions that improve the quality of life of those who need them most. In particular, we focus on arm-related disabilities.

Entertainment

Ever thought of controlling Mario Bros. with your hands? Forget about joysticks and become the joystick yourself, guiding your avatar through the movement of your limbs! Not only do we want to join the real-experience gaming revolution, but we also want to bring interactive tools to people with disabilities by providing a more user-friendly experience.

What it does

Our team has used an electromyogram, a device that monitors the differences in the electric signal generated by muscle contractions. Through a machine-learning algorithm, several types of movement can be identified in real time simply from these electric impulses. To test its manifold applicability, we not only incorporated the model into an Arduino-based robotic arm, but also designed a video-game interface where players are controlled through the movement of their limbs.

How we built it

Electromyogram & Arduino

By means of an Arduino-based electromyogram and electrodes, we captured the electric signals generated by muscle gestures. After several tests to identify the optimal electrode locations on the user's arm, which technology to use, and the interference and noise affecting the electric pulse wave, we managed to record, store, and decode the signals of a variety of gestures from two electromyograms (biceps and triceps): notably arm extension and contraction, wrist twisting, and fist clenching.
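A minimal sketch of how the two channels could be read on the Python side. The port name, baud rate, and the `"<biceps>,<triceps>"` line format are assumptions for illustration, not the exact protocol we used:

```python
def parse_emg_line(line, vref=5.0, resolution=1023):
    """Parse one 'biceps,triceps' serial line of 10-bit ADC counts into volts."""
    counts = [int(v) for v in line.strip().split(",")]
    if len(counts) != 2:
        raise ValueError(f"expected 2 channels, got {len(counts)}")
    return tuple(c * vref / resolution for c in counts)

def stream_emg(port_name="/dev/ttyUSB0", baud=9600):
    """Continuously print parsed channel voltages (needs pyserial + hardware)."""
    import serial  # pyserial; only needed when actually streaming
    with serial.Serial(port_name, baud, timeout=1) as port:
        while True:
            raw = port.readline().decode("ascii", errors="ignore")
            if raw.strip():
                biceps_v, triceps_v = parse_emg_line(raw)
                print(f"biceps={biceps_v:.3f} V  triceps={triceps_v:.3f} V")
```

The parsing is kept separate from the serial I/O so it can be tested without the hardware attached.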

Generating our dataset

Combining Arduino and Python, we succeeded in building our own dataset of signals. One member of the group performed each of the 4 gestures about 50 times, gathering around 223 samples in total. To assess the validity of this dataset, given that we only had a single subject, that all samples were recorded in the same session, and that only two signals were used, we also worked with an open-source dataset of more than 6 million samples from more than 36 users and up to 8 electromyograms (8 signals).
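The recording step can be sketched as appending labelled signal windows to a CSV file. The gesture labels below are placeholders for the four gestures described above, and the row layout (label followed by interleaved channel samples) is an assumption:

```python
import csv
import io

GESTURES = ["extension", "contraction", "twist", "clench"]  # the 4 classes

def append_samples(fileobj, gesture, windows):
    """Append labelled windows to an open CSV file object.

    Each window is a sequence of (biceps, triceps) sample pairs; a row is
    written as: gesture, b0, t0, b1, t1, ...
    """
    if gesture not in GESTURES:
        raise ValueError(f"unknown gesture: {gesture}")
    writer = csv.writer(fileobj)
    for window in windows:
        row = [gesture]
        for biceps, triceps in window:
            row.extend([biceps, triceps])
        writer.writerow(row)

# Example: one recorded window of two samples for a fist clench
buf = io.StringIO()
append_samples(buf, "clench", [[(510, 498), (530, 470)]])
```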

Machine Learning for gesture identification

Signals are time series. We have simultaneous information from different electromyograms monitoring differences in electric pulse. To identify the gesture (a class) from a given set of signals, relevant features are extracted from a time window (post-processing of the dataset), such as burst time, maximum value, minimum value, average, standard deviation, total power, median frequency, and maximum power. These features are then fed to a machine-learning algorithm; Random Forest provided the best results.
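A sketch of the per-window feature extraction for one channel, covering a subset of the features listed above (the frequency-domain ones, such as median frequency, would additionally need an FFT, e.g. via `numpy.fft`):

```python
from statistics import mean, stdev

def window_features(window):
    """Compute time-domain features for one channel's window of samples:
    max, min, average, standard deviation, and total power (sum of squares)."""
    return {
        "max": max(window),
        "min": min(window),
        "mean": mean(window),
        "std": stdev(window),
        "total_power": sum(x * x for x in window),
    }
```

Running this over each channel of each window turns the raw time series into the fixed-length feature vectors the classifier consumes.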

Both datasets, the open-source one and the one self-generated with the Arduino electromyogram, were tested using a 30-70 split. The open-source dataset reached 87.7% accuracy on unseen samples, while the self-generated one reached 98.5%. The self-generated dataset only contains data from a single user at a single point in time, whereas the open-source dataset covers 36 people with samples from two different days per user; this higher variability makes it more general and applicable to daily use, which explains the difference in accuracies.
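The training and evaluation step can be sketched with scikit-learn. The feature table below is a toy stand-in (two well-separated fake "gestures"), and the hyperparameters are illustrative defaults, not the ones we tuned:

```python
import random
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy stand-in for the real feature table: two well-separated classes,
# two features each, so the forest should classify them easily.
rng = random.Random(0)
X = ([[rng.gauss(0.0, 0.3), rng.gauss(0.0, 0.3)] for _ in range(100)]
     + [[rng.gauss(3.0, 0.3), rng.gauss(3.0, 0.3)] for _ in range(100)])
y = [0] * 100 + [1] * 100

# 30-70 split as in the write-up: 30% of the samples held out for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)  # fraction of unseen samples correct
```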

Robotic arm

By coordinating Python and Arduino code, we managed to drive the Arduino servos of a robotic arm (already built before the hackathon) and map its movements to the outputs of the gesture predictor.
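Conceptually, this is a mapping from predicted gesture labels to commands sent over serial. The single-byte command protocol below is hypothetical, shown only to illustrate the Python side of the link:

```python
# Hypothetical mapping from classifier outputs to single-byte commands that
# an Arduino sketch could translate into servo movements.
GESTURE_COMMANDS = {
    "extension": b"E",
    "contraction": b"C",
    "twist": b"T",
    "clench": b"F",
}

def command_for(gesture):
    """Translate a predicted gesture label into the byte to send."""
    try:
        return GESTURE_COMMANDS[gesture]
    except KeyError:
        raise ValueError(f"no servo command for gesture: {gesture}") from None

def send_gesture(port, gesture):
    """Write the command for `gesture` to an open (pyserial-like) port."""
    port.write(command_for(gesture))
```

Keeping the label-to-byte mapping in one dictionary means the classifier and the Arduino sketch can evolve independently as long as both agree on the command set.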

The game

The game is a retro-style 2D fighter where 2 players fight each other, as in the Street Fighter series. Each player has a health bar and 1 minute to defeat the other.

Both players have 4 kinds of movements:

  1. Moving right
  2. Moving left
  3. Jump
  4. Attack

Player 1 plays with the WASD keys and player 2 with the arrow keys.

Our main objective has not been to create our own game, but to use our trained algorithm to detect different body movements and use them as "buttons" that trigger the behaviours of the player 1 character on screen.

Challenges we ran into

The first challenge we faced at the start of hackUPC was that a member of the group tested positive for COVID, so we have been working in a hybrid mode, which sometimes slowed our pace because we could not communicate instantly.

For security reasons, JavaScript cannot load local files from the client side, so we had to emulate a client-server topology to be able to read the file where the data generated by our algorithm is stored.
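One lightweight way to get such a client-server setup is Python's built-in static file server; serving the game directory over HTTP lets the page fetch the prediction file. The directory layout and file name here are placeholders:

```python
# Serve a directory over HTTP so the browser can fetch files the game needs
# (e.g. the file holding the latest predicted gesture).
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

def make_server(directory, port=0):
    """Create a server for `directory`; port=0 picks a free port."""
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    return ThreadingHTTPServer(("127.0.0.1", port), handler)

def serve_forever(directory, port=8000):
    """Block and serve `directory` until interrupted."""
    with make_server(directory, port) as httpd:
        host, bound_port = httpd.server_address
        print(f"serving {directory} on http://{host}:{bound_port}")
        httpd.serve_forever()
```

For a quick one-off, `python -m http.server` from the game directory achieves the same thing.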

Real-time data generation from the electromyogram has been a problem as well, so we created a small Python program that simulates the real-time output we couldn't achieve, making the game run as if the data were produced by a person's movements.
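A sketch of the simulator idea: periodically emit a plausible gesture label so the game can be driven without live EMG hardware. The label set and timing are assumptions, not the exact values our program used:

```python
import random
import time

GESTURES = ["extension", "contraction", "twist", "clench"]

def simulate(n_events, seed=None, interval=0.0):
    """Yield n_events pseudo-random gesture labels, pausing `interval`
    seconds between them to mimic a live prediction stream."""
    rng = random.Random(seed)
    for _ in range(n_events):
        if interval:
            time.sleep(interval)
        yield rng.choice(GESTURES)

# Example: emit 5 fake predictions, one every 200 ms, for the game to poll
# for g in simulate(5, seed=0, interval=0.2):
#     print(g)
```

Seeding the generator makes a simulated session reproducible, which helped when debugging the game side in isolation.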

Accomplishments that we're proud of

Even though we have not been able to work together in the same physical place, we managed to communicate fluently during the whole hackathon, show our progress to every team member, and work in the same direction.

As we had no experience developing this kind of game, we followed an existing tutorial to learn how to create a web-based game with HTML, CSS, and JavaScript.

What we learned

We broadened our knowledge of Arduino, communication between Python and Arduino code, handling electromyogram signals, and developing a video game.

What's next for Reaching the unreachable

Getting real-time gesture prediction working end to end, expanding the project to a wider range of users, and increasing the number of features in the video game.

The game can be improved by adding sound, touch controls for smartphones, controller compatibility, resizing to any screen, a small tutorial, an expanded roster of playable characters, VFX, and other features.
