Project Description: What problem are you solving? Which programming languages, tools, or platforms did you use, and how?
NeuroReach is a brain–computer-interface controlled robotic arm that allows users to perform physical movements using EEG signals. The system detects neural patterns associated with intentional movement and translates them into actions of the robotic arm. This allows hands-free control and shows how neural signals can be directly linked to real-world physical interaction.
The project was built using an electroencephalogram (EEG) headset to capture brain signals, which are then processed and classified by a Support Vector Machine (SVM) model written in Python. Based on the predicted intent, commands are sent to the robotic arm to perform the corresponding movement. To ensure reliable communication between the software and the robotic arm, a Raspberry Pi communicates with my brain–computer interface over the network via IP addresses.
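As a rough illustration of the classification step, here is a minimal sketch of an SVM intent classifier in Python using scikit-learn. The feature layout, labels, and `COMMANDS` vocabulary are my assumptions for the example, not the project's actual code, and the training data here is synthetic.

```python
# Hypothetical sketch: classify a windowed EEG feature vector with an SVM
# and map the predicted label to a robot command string.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic training data: 100 windows x 8 band-power features,
# with toy labels 0 = "rest", 1 = "move" derived from feature 0.
X = rng.normal(size=(100, 8))
y = (X[:, 0] > 0).astype(int)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

COMMANDS = {0: "REST", 1: "GRIP"}  # assumed command vocabulary

def predict_command(window_features):
    """Map one EEG feature vector to a robot command string."""
    label = int(clf.predict(window_features.reshape(1, -1))[0])
    return COMMANDS[label]

print(predict_command(np.array([2.0, 0, 0, 0, 0, 0, 0, 0])))
```

In a real pipeline the command string would then be sent to the Raspberry Pi (for example over a TCP socket) to drive the arm.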
Purpose: Why did you choose this problem? Share your motivation and explain why your project matters. How could it improve lives or make an impact if developed further?
The inspiration behind NeuroReach came from my interest in brain–computer interfaces and their medical applications, specifically in assisting individuals with paralysis. Many assistive devices today are expensive or limited in functionality. I wanted to create a system where brain signals could be used in a more practical and accessible way to restore basic movement and independence.
The specific reason I am targeting people with paralysis is that I recently visited a disabled care centre called Singapore Enable, and it frustrated me that such amazing people cannot perform simple tasks such as moving their hands. With the long-term goal of increasing accessibility and real-world impact, I also aim to make the system more adaptable to different users and explore ways to make the hardware more affordable.
How it Works: What can users do with your project? What are the key features or user stories? Did you use any datasets or external resources?
The whole system is really easy to interact with since it requires no calibration, and all the user needs to do is think about the hand movement, and the exoskeleton arm will respond in real time. A simple and intuitive interface was designed to display system status and ensure smooth interaction between the user and the hardware.
One major challenge was dealing with noisy EEG data, since brain signals are highly variable and sensitive to interference. Achieving high classification accuracy required careful signal processing and repeated testing. I am proud of building a fully functional end-to-end system in which brain signals directly control a physical robotic arm; the project goes beyond a concept and demonstrates real-world applicability.
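One common way to tame noisy EEG, sketched below, is a zero-phase band-pass filter that keeps the motor-imagery-relevant band and rejects drift and mains hum. The sampling rate, cutoffs, and filter order here are assumptions for illustration, not the project's actual parameters.

```python
# Illustrative EEG cleanup: 4th-order Butterworth band-pass (8-30 Hz),
# applied forwards and backwards (filtfilt) for zero phase distortion.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed EEG sampling rate, Hz

def bandpass(signal, low=8.0, high=30.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter over one EEG channel."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

# Demo: a 12 Hz "brain rhythm" buried in 50 Hz mains interference.
t = np.arange(0, 2, 1 / FS)
raw = np.sin(2 * np.pi * 12 * t) + np.sin(2 * np.pi * 50 * t)
clean = bandpass(raw)  # 12 Hz survives, 50 Hz is strongly attenuated
```

Filtering like this would run on each signal window before feature extraction and classification.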
Built With
- bcis
- eeg
- microcomputer
- python
- robotics
- supportvectormachine(svm)