Inspiration

We were provided a foundational scene and 3D models to use as GameObjects (robots, weapons, artifacts) in a cyberpunk-dystopian theme. We were excited to integrate the EMG, BCI, and VR themes of the hackathon, given our teammates' diverse backgrounds (EE, pre-medicine, and CS), and we wanted to learn as much as we could so we could better help each other and build a cool project. Although the game's functions are basic, we prioritized the features we agreed were most important for the user.

What it does

Although the app is fully functional on any laptop, PEW connects to the Meta Quest 2 for an immersive VR gaming experience. Users can play hands-free by recording BCI and EMG signals live, which are sent as input to Unity over a UDP connection. BCI signals let the player motionlessly kill robot targets by "focusing" on one robot (accelerometer signals can be substituted if real-time EEG is too noisy to decode), and EMG signals seamlessly switch between three weapons that provide different attacks.
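As a rough illustration of the EMG-to-weapon mapping described above, one simple approach is to compute the RMS amplitude of a short EMG window and bucket it into one of three weapon slots. This is a hedged sketch, not our actual decoder; the weapon names and thresholds here are hypothetical.

```python
import math

# Hypothetical weapon slots; the real game maps EMG activity to three weapons.
WEAPONS = ["pistol", "rifle", "laser"]

def rms(samples):
    """Root-mean-square amplitude of an EMG window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def select_weapon(emg_window, thresholds=(0.2, 0.6)):
    """Map EMG activity level to one of three weapons (assumed thresholds)."""
    level = rms(emg_window)
    if level < thresholds[0]:
        return WEAPONS[0]
    if level < thresholds[1]:
        return WEAPONS[1]
    return WEAPONS[2]

print(select_weapon([0.1] * 10))  # low activity -> "pistol"
print(select_weapon([1.0] * 10))  # high activity -> "laser"
```

In practice the thresholds would be calibrated per player, since baseline muscle activity varies between users and electrode placements.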

How we built it

PEW uses a UDP connection to feed data into Unity: a Python script running inside PhysioLabXR preprocesses the BCI/EMG signals on the client side (the player) and sends them as UDP messages. The server side was implemented in C#, the only language Unity supports. Details about our VR integration are in our GitHub README. The rest of our work focused on in-game features, including a UI with a live scoreboard and countdown timer. We also enhanced the UX by adding sound effects to every player action, making object deaths more dramatic with death-particle emission, and adding ammunition.
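The client/server flow above can be sketched end to end in a few lines. This is an illustrative mock, with both sides in Python for brevity (our actual server is C# inside Unity), and the port number and JSON message format are assumptions, not the project's real protocol.

```python
import json
import socket

HOST, PORT = "127.0.0.1", 9000  # assumed address; the real port is configured in Unity

def send_event(event):
    """Client side (as in the PhysioLabXR Python script): send a decoded event as JSON."""
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.sendto(json.dumps(event).encode("utf-8"), (HOST, PORT))
    client.close()

# Server side (stand-in for the Unity C# listener): bind, then receive one datagram.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind((HOST, PORT))
server.settimeout(2)

send_event({"signal": "EMG", "action": "switch_weapon"})
data, _ = server.recvfrom(1024)
print(json.loads(data))  # {'signal': 'EMG', 'action': 'switch_weapon'}
server.close()
```

UDP is connectionless, so the sender never blocks waiting for the game; dropped packets are acceptable here because signal events are sent continuously.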

Challenges we ran into

The first hurdle was finding a way to reliably stream raw BCI and EMG signals to our own computers by calibrating the given software and hardware. We wanted to use the recommended Lab Streaming Layer (LSL) to integrate g.tec's Unicorn BCI into our Unity project, but we had issues sending LSL messages to Unity. So our group tried establishing a TCP connection instead, which worked perfectly. Ultimately, because UDP offers faster, lower-overhead delivery, we switched the connection to UDP. We also had trouble decoding the EEG signals due to noisy data.
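One common first-line mitigation for noisy EEG of the kind we ran into is to smooth each channel with a short moving average before any thresholding or decoding. The sketch below is a generic example of that technique, not the filtering we actually shipped; the window size is an arbitrary assumption.

```python
from collections import deque

def moving_average(samples, window=5):
    """Smooth a 1-D signal with a sliding mean; trades a little latency for less noise."""
    buf = deque(maxlen=window)
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out

noisy = [0.0, 0.0, 3.0, 0.0, 0.0]  # a single spike artifact
print(moving_average(noisy, window=3))  # [0.0, 0.0, 1.0, 1.0, 1.0]
```

A longer window suppresses more noise but delays the decoded response, which matters in a real-time game loop.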

Accomplishments that we're proud of

We are proud to have taken EMG input into the Unity app and built a robust corresponding response: an action in the game itself.

What we learned

We learned how to use BCI technology to drive actions in applications, how to connect the two interfaces by sending information over UDP, and how much small details change the player's experience. We also learned how to use PhysioLabXR to send, receive, and transform LSL data streams with Python scripts.

What's next for PEW

We see the Neureality Game as an endless sea of opportunities: a player health bar, robots that can attack you, new scenes with different themes or levels, additional weapons and a reloading feature, and more UI effects such as glowing elements. We also intend to implement working decoding for BCI signals such as SSVEP, ERP, alpha-wave suppression, and ASSR.
