For our natHACKS 2021 submission, the RascalToads team had one goal in mind: "How do we control physical things with our minds?" Most BCI work stays local, between the device and a single computer. We wanted to reach out into the world and do something. Here is what we did.

We trained software to classify gestures from a Muse headset, including detections like winks, blinks, and brow movements. Once we had gestures, we needed to get them off our computers. To start that journey, we leveraged webhooks, which allow web communication between conforming parties.

We created the Webhook Configurator (WHC) to bridge the varying modes of communication. Data is sent to the backend from either the Petal Metrics app or our Python script. With WHC serving as a translator, we can send the received gestures outward and control who receives detections, which ones, and when. In WHC, you can select which detections to listen for. Each detection can be forwarded in its raw JSON format or reduced down to a true or false value. There can even be multiple recipients: one brow movement could flip a switch in Austin and control a robot in Alberta. That's exactly what we did. We controlled a SwitchBot with IFTTT, LEDs with an Arduino, and a Sphero with a Raspberry Pi.

The backend, frontend, and Python scripts were all created during this hackathon. The Python scripts include new gesture classifications and web interfaces to send and receive webhooks. References to the code are on GitHub.
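The filtering and fan-out described above can be sketched in a few lines of Python. This is only an illustration of the idea, not WHC's actual code: the payload field names (`detection`), subscriber fields (`url`, `detection`, `raw`), and the helper functions are assumptions made for the example.

```python
import json


def reduce_detection(payload, subscribed, raw):
    """Filter one incoming detection event for a subscriber.

    Returns None when the event is not one the subscriber listens for,
    the raw JSON-style dict when raw is True, or a bare True otherwise.
    (Field name "detection" is a hypothetical payload shape.)
    """
    if payload.get("detection") != subscribed:
        return None  # subscriber did not ask for this detection
    return payload if raw else True


def relay(payload, subscribers, send):
    """Fan one event out to every matching subscriber via send(url, body)."""
    delivered = 0
    for sub in subscribers:
        body = reduce_detection(payload, sub["detection"], sub["raw"])
        if body is not None:
            send(sub["url"], json.dumps(body))
            delivered += 1
    return delivered


# Example: one wink event, three subscribers in different places.
event = {"detection": "wink", "confidence": 0.92}
subscribers = [
    {"url": "http://austin.example/switch", "detection": "wink", "raw": False},
    {"url": "http://alberta.example/robot", "detection": "wink", "raw": True},
    {"url": "http://other.example/led", "detection": "blink", "raw": False},
]

sent = []  # stand-in for real HTTP POSTs
relay(event, subscribers, lambda url, body: sent.append((url, body)))
```

Here the wink event reaches only the two wink subscribers; one receives the reduced `true` value and the other the full JSON payload, mirroring the raw-versus-boolean choice WHC offers.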
