Inspiration

We all struggle with maintaining focus in a world filled with constant distractions — from short-form content to endless notifications. MindsEye was inspired by the idea that technology, rather than being the cause of distraction, could also be part of the solution. We wanted to explore how data from the brain and eyes could help visualize, measure, and ultimately improve human attention.

What it does

MindsEye captures real-time EEG brainwave data and pupil diameter to estimate a user’s focus level. It visualizes this data in an interactive dashboard and provides personalized insights to help users understand their cognitive state. The system can also be used in live “focus competitions” with friends, turning self-improvement into a shared and motivating experience.

How we built it

We built a full-stack system consisting of a React/Electron frontend, a FastAPI backend, and a machine learning model using XGBoost. The Muse EEG headband streams raw brainwave data, which we combine with eye-tracking measurements. The backend processes this multimodal input, extracts features in real time, and sends predictions to the frontend via WebSocket for live visualization.
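As a rough illustration of the real-time feature-extraction step, the sketch below computes EEG band-power features from a single-channel window using Welch's method. The band boundaries, window length, and feature layout here are our own assumptions for the example, not the exact production pipeline; the resulting vector is the kind of input an XGBoost model could consume.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

# Assumed frequency bands (Hz); the real pipeline may use different cutoffs.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(eeg_window: np.ndarray, fs: int = 256) -> np.ndarray:
    """Return one band-power value per band for a single-channel EEG window.

    eeg_window: 1-D array of raw samples.
    fs: sampling rate in Hz (the Muse headband streams at roughly 256 Hz).
    """
    freqs, psd = welch(eeg_window, fs=fs, nperseg=min(len(eeg_window), fs))
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        # Integrate the power spectral density over the band.
        feats.append(trapezoid(psd[mask], freqs[mask]))
    return np.array(feats)

# Example: one second of synthetic 10 Hz (alpha-range) activity plus noise.
fs = 256
rng = np.random.default_rng(0)
t = np.arange(fs) / fs
window = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)
features = band_power_features(window, fs)
```

In a setup like ours, a vector of such band powers (per channel, per window) would be concatenated with pupil-diameter statistics and streamed to the model for prediction.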

Challenges we ran into

Integrating the EEG hardware with the web environment was challenging due to Bluetooth and browser security limitations. Calibrating the model for individual users also required careful preprocessing and data cleaning. Finally, synchronizing brain and pupil data streams with millisecond precision was difficult but essential for accurate predictions.
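One way to line up the two streams, sketched here under the assumption that every sample carries a monotonically increasing timestamp in seconds, is nearest-neighbor matching within a small tolerance. This is an illustrative approach, not necessarily the exact synchronization logic we shipped:

```python
import bisect

def align_streams(eeg, pupil, tol=0.005):
    """Pair each EEG sample with the nearest pupil sample within `tol` seconds.

    eeg, pupil: lists of (timestamp, value) tuples, each sorted by timestamp.
    Returns a list of (timestamp, eeg_value, pupil_value) triples; EEG samples
    with no pupil reading inside the tolerance are dropped.
    """
    pupil_ts = [t for t, _ in pupil]
    pairs = []
    for t, v in eeg:
        i = bisect.bisect_left(pupil_ts, t)
        # Consider the pupil samples on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(pupil)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(pupil_ts[k] - t))
        if abs(pupil_ts[j] - t) <= tol:
            pairs.append((t, v, pupil[j][1]))
    return pairs

# Toy example: 256 Hz EEG ticks matched against sparser pupil readings.
eeg = [(0.000, 1.0), (0.004, 1.1), (0.008, 1.2)]
pupil = [(0.001, 3.2), (0.009, 3.3)]
aligned = align_streams(eeg, pupil, tol=0.005)
```

Dropping unmatched samples instead of interpolating keeps the logic simple at the cost of some data; a tighter tolerance trades coverage for timing precision.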

Accomplishments that we're proud of

We successfully built a working end-to-end system capable of analyzing real brain and eye data to infer attention in real time. Our calibration method improved prediction accuracy across users, and our interface turned focus, an abstract concept, into something visible, gamified, and interactive.

What we learned

We learned how to process and analyze biosignals, how to design responsive and meaningful data visualizations, and how challenging but rewarding multimodal fusion can be. Most importantly, we gained a deeper appreciation for how technology can be used not only to measure the mind but also to help train it.

What's next for MindsEye

We plan to improve calibration accuracy, expand our dataset, and integrate adaptive feedback loops that help users refocus when their attention drifts. Long-term, we envision MindsEye evolving into a comprehensive focus training platform — combining neuroscience, machine learning, and gamified wellness to help people take back control of their attention.
