Inspiration
We were inspired by the concept of Interceptive Resonance: the silent, biological conversation that happens in a room but is often "invisible" to those with social anxiety or neurodivergence. As a first-generation student, attending networking events can be really difficult because feelings of imposter syndrome are prevalent, which fuels social anxiety and even affects mental health. We wanted to build a tool that gives people a "sixth sense," turning overwhelming emotional noise into a clear, navigable map.
What it does
ECHO FLO is a relational data engine that treats a room of people like a live graph database. Using AR and haptics, it:
- Visualizes the Invisible: maps each individual with a colored aura influenced by their emotions and facial expressions
- Reads the Room: lets users see the overall "Space" vibe and how people are interacting
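As a rough illustration of the aura idea, an individual's emotion scores could be blended into a single display color. The emotion labels, base colors, and blending logic below are our own illustrative assumptions, not the actual ECHO FLO implementation:

```typescript
// Sketch: derive an aura color from a person's emotion scores.
// Labels, colors, and weighting are illustrative assumptions only.

type EmotionScores = { calm: number; joy: number; stress: number };

// Hypothetical base RGB color per emotion.
const AURA_COLORS: Record<keyof EmotionScores, [number, number, number]> = {
  calm: [80, 170, 255],  // soft blue
  joy: [255, 200, 80],   // warm yellow
  stress: [255, 90, 90], // muted red
};

// Blend the base colors, weighted by each emotion's relative score.
function auraColor(scores: EmotionScores): string {
  const total = scores.calm + scores.joy + scores.stress || 1;
  let [r, g, b] = [0, 0, 0];
  for (const key of Object.keys(AURA_COLORS) as (keyof EmotionScores)[]) {
    const w = scores[key] / total;
    r += AURA_COLORS[key][0] * w;
    g += AURA_COLORS[key][1] * w;
    b += AURA_COLORS[key][2] * w;
  }
  return `rgb(${Math.round(r)}, ${Math.round(g)}, ${Math.round(b)})`;
}
```

A fully calm reading would render as the pure calm color, while mixed emotions blend toward an intermediate hue.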
How we built it
We architected the system by first prototyping different designs with careful consideration of the colors used. We made sure to choose colors that were pleasing to the eye and avoided any harsh tones. We iterated on our design about four times before moving into Figma Make. ECHO FLO consists of an app view with a home page, Analytics, Settings, and a Biometric Health dashboard. It is meant to be used with Echo Lenses, which would be a separate tool: the glasses would let a person see the emotions and vibes of a room upon entering, as well as the people around them. The Live and Scene views are what a user would see once the lenses are on.
Challenges we ran into
The biggest hurdle was data overload. We realized that showing everything (heart rate, breathing rate, and so on) would actually increase social anxiety. We had to iterate on our "Sensory Control" settings, creating a vibe opacity slider so users could dim the data and focus on the human connection.
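The slider idea above could be sketched as a single value that scales every biometric overlay's opacity, hiding layers entirely once they fall below a floor so faint data doesn't linger as visual noise. The names, floor threshold, and layer shape are assumptions for illustration, not the shipped design:

```typescript
// Sketch: one "vibe opacity" slider (0..1) dims all biometric overlays.
// Layer names and the visibility floor are illustrative assumptions.

type OverlayLayer = { name: string; baseOpacity: number };

function applyVibeOpacity(layers: OverlayLayer[], slider: number, floor = 0.1) {
  return layers.map((layer) => {
    // Clamp the slider to [0, 1], then scale each layer's base opacity.
    const opacity = layer.baseOpacity * Math.min(Math.max(slider, 0), 1);
    // Below the floor, hide the layer outright instead of showing a faint ghost.
    return { ...layer, opacity, visible: opacity >= floor };
  });
}
```

One slider controlling every layer (rather than a toggle per metric) keeps the Sensory Control settings simple under stress, which was the whole point of the feature.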
Accomplishments that we're proud of
We are incredibly proud of creating our logo and, for some of us, learning how to use Figma for the first time. Moving beyond just seeing the problem to actually solving it through Figma Make felt like a breakthrough. It was an amazing experience seeing everyone learn and work together to turn this idea into a tool that feels like a legitimate medical and wellness aid.
What we learned
We learned about LiDAR, and that "Clear is Kind." For someone with ADHD, knowing that a room is "High Friction" is far more helpful than being told the room is "Tense." We also discovered the power of Haptic Sonification: you don't always need to see the data to feel the safety of a synchronized group.
What's next for ECHO FLO
The team hopes to deploy ECHO FLO as a full app and extend the interface with machine learning to predict a "Social Cortisol Spike" before it happens, prompting the user to take a break. We hope to ship it to the app store in the coming months.
Built With
- figmamake
- react
- typescript
