inSIGHT
Real-time customer sentiment analysis through wearable technology and AI-powered conversation coaching.
Inspiration
Customer feedback arrives too late. By the time companies identify problems, damage is already done. Facial expressions reveal emotions before customers verbalize them. We built a system that analyzes these signals in real time through wearable camera glasses. Users receive instant sentiment feedback and conversation guidance through a mobile app, replacing the traditional survey-based feedback cycle.
What it does
The platform analyzes customer sentiment through facial expression recognition. An ESP32 camera embedded in wearable glasses captures facial expressions during conversations. The backend processes this visual data to generate real-time sentiment scores.
Users receive immediate feedback on their mobile device. The app displays detected emotions and provides actionable conversation recommendations based on sentiment trends.
Product teams gain quantified emotional responses during user testing. The data shows exactly when users experience confusion, frustration, or satisfaction during prototype interactions.
Customer experience teams get an early warning system. The platform identifies recurring negative sentiment patterns across multiple interactions. Service representatives receive real-time guidance when sentiment scores decline.
The analytics dashboard shows sentiment trends and interaction quality metrics over time.
How we built it
The ESP32 camera module transmits video to a Raspberry Pi via wireless connection. The Pi streams the footage to our backend server for processing. This setup keeps the ESP32 lightweight while providing sufficient processing power.
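ESP32 camera modules typically serve video as an MJPEG byte stream, so the Pi's relay step comes down to splitting that stream into individual JPEG frames before forwarding them. A minimal sketch of that split (the function name and buffer handling are our illustration, not the project's actual code):

```python
# Split an MJPEG byte stream into individual JPEG frames.
# JPEG images start with the 0xFFD8 marker and end with 0xFFD9.
SOI = b"\xff\xd8"  # start of image
EOI = b"\xff\xd9"  # end of image

def extract_frames(buffer: bytes):
    """Return (complete JPEG frames found in buffer, leftover bytes)."""
    frames = []
    while True:
        start = buffer.find(SOI)
        end = buffer.find(EOI, start + 2)
        if start == -1 or end == -1:
            break
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]
    return frames, buffer

# Example: two complete frames plus a partial third arrive in one chunk.
chunk = SOI + b"frame1" + EOI + SOI + b"frame2" + EOI + SOI + b"par"
frames, leftover = extract_frames(chunk)
# frames holds the two complete JPEGs; leftover carries the partial
# frame forward until the next chunk arrives.
```

Keeping the parsing on the Pi rather than the ESP32 matches the split described above: the camera stays a dumb streamer while the Pi handles buffering.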
The Python backend runs three specialized AI models as coordinated agents. The face recognition model uses DeepFace, a deep learning facial recognition framework that wraps architectures such as VGG-Face, FaceNet, and OpenFace, to identify and track individuals across video frames so each person keeps a consistent identity over time. The facial expression model is a fine-tuned CNN that classifies seven emotional states (anger, disgust, fear, happiness, sadness, surprise, neutral) with 85% accuracy. The sentiment analysis model aggregates multi-frame expression data through a recurrent neural network (RNN), applying temporal weighting to produce composite sentiment scores and recommendations based on the emotional trajectory.
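The aggregation step can be illustrated with a much simpler stand-in for the RNN: combine per-frame emotion labels into one score using exponential temporal weighting, so recent frames count more. The valence values and decay factor below are our illustrative assumptions, not the project's trained weights:

```python
# Simplified stand-in for the RNN aggregation: a temporally weighted
# average of per-frame emotion valence. Valence values are assumptions.
VALENCE = {
    "happiness": 1.0, "surprise": 0.3, "neutral": 0.0,
    "sadness": -0.6, "fear": -0.7, "disgust": -0.8, "anger": -1.0,
}

def sentiment_score(frame_emotions, decay=0.8):
    """Weighted average of emotion valence; the newest frame has weight 1,
    each older frame is down-weighted by `decay`."""
    weights = [decay ** i for i in range(len(frame_emotions))][::-1]
    total = sum(w * VALENCE[e] for w, e in zip(weights, frame_emotions))
    return total / sum(weights)

score = sentiment_score(["neutral", "neutral", "sadness", "anger"])
# score is negative because the most recent frames carry negative valence
```

A real RNN learns this weighting from data instead of fixing it, but the composite-score idea is the same.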
The Flask API serves data through WebSocket connections to both the React web dashboard and mobile app. We optimized the mobile app to maintain sub-second latency so recommendations stay relevant during active conversations.
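Each update pushed over the WebSocket is a small JSON message that both clients render. A sketch of what one message might look like; the field names here are our assumptions, not the project's actual schema:

```python
import json
import time

def build_update(person_id: str, emotion: str, score: float) -> str:
    """Serialize one sentiment update for the WebSocket channel.
    Field names are illustrative; the real schema may differ."""
    payload = {
        "person": person_id,
        "emotion": emotion,
        "sentiment": round(score, 2),
        "timestamp": time.time(),
    }
    return json.dumps(payload)

msg = build_update("user-1", "happiness", 0.873)
```

Keeping messages this small is one way to stay under the sub-second latency budget mentioned above.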
Challenges we ran into
Real-time processing created latency issues. We solved this through a multithreaded architecture for parallel facial detection and expression analysis. Mobile feedback requirements demanded additional optimization to keep response times quick.
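The parallel stage can be sketched with a thread pool that runs face detection and expression analysis concurrently on each frame. The worker functions below are placeholders for the real models, not the project's code:

```python
# Sketch of the multithreaded stage: submit face detection and
# expression analysis for the same frame to a thread pool so they
# overlap instead of running back-to-back.
from concurrent.futures import ThreadPoolExecutor

def detect_faces(frame):
    return [{"box": (0, 0, 64, 64)}]   # placeholder detector

def classify_expression(frame):
    return "neutral"                    # placeholder classifier

def process_frame(frame):
    with ThreadPoolExecutor(max_workers=2) as pool:
        faces_future = pool.submit(detect_faces, frame)
        expr_future = pool.submit(classify_expression, frame)
        return faces_future.result(), expr_future.result()

faces, expression = process_frame(b"jpeg-bytes")
```

With model inference dominated by native code that releases the GIL, threads are enough here; CPU-bound Python work would call for processes instead.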
Building the recommendation engine required analyzing sentiment trends, not just current emotions. The system evaluates whether sentiment is improving or declining to generate appropriate guidance.
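One simple way to turn trends into guidance is to fit the slope of the recent sentiment scores and branch on its direction. The thresholds and messages below are illustrative assumptions, not the engine's actual rules:

```python
# Sketch of trend-based guidance: least-squares slope over recent
# sentiment scores, mapped to a recommendation.
def trend_slope(scores):
    """Least-squares slope of scores against their index positions."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def recommend(scores, threshold=0.05):
    slope = trend_slope(scores)
    if slope < -threshold:
        return "Sentiment declining: slow down and ask clarifying questions."
    if slope > threshold:
        return "Sentiment improving: keep the current approach."
    return "Sentiment stable: no change needed."

advice = recommend([0.4, 0.2, -0.1, -0.3])
# a steady decline triggers the "declining" guidance
```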
Model calibration across different users proved difficult. We tuned the system to account for variation in facial expressions while maintaining consistent scores across lighting conditions and face angles.
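A common way to handle per-user variation is to normalize raw scores against a rolling baseline, so users with naturally flat or expressive faces land on a comparable scale. This windowed z-score is our simplification, not the project's calibration method:

```python
# Sketch of per-user calibration: z-score each raw reading against a
# rolling window of that user's own history.
from collections import deque
from statistics import mean, stdev

class Calibrator:
    def __init__(self, window=50):
        self.history = deque(maxlen=window)

    def normalize(self, raw: float) -> float:
        self.history.append(raw)
        if len(self.history) < 2:
            return 0.0          # not enough history to calibrate yet
        sd = stdev(self.history)
        if sd == 0:
            return 0.0          # perfectly flat history
        return (raw - mean(self.history)) / sd
```

Scores then measure deviation from each user's own baseline rather than an absolute expression intensity.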
Accomplishments that we're proud of
We achieved real-time sentiment analysis on budget hardware. The ESP32 is very affordable yet delivers professional-grade emotional tracking.
The mobile coaching system provides live conversation guidance through wearable glasses and smartphones.
Our model achieves 85% accuracy in emotion classification, surpassing many commercial solutions.
The interface requires minimal training. Users understand sentiment metrics and recommendations immediately.
The system is production-ready. Companies could deploy it without additional development.
What we learned
Hardware constraints forced better design decisions. Working within the ESP32's limits pushed us toward creative solutions.
Emotional analysis requires context. The same expression means different things during support calls versus product testing.
Real-time systems need a stream-based architecture. We transitioned from batch processing to continuous data flow, which significantly altered our entire system design.
What's next for inSIGHT
Add predictive analytics by analyzing patterns over time. The system could forecast a decline in sentiment and enable proactive intervention.
Enhance the recommendation engine with context awareness. Different guidance for sales calls, support interactions, and product testing would improve relevance.
Implement privacy controls and data anonymization to meet regulatory requirements for deployment in regulated industries.
Built With
- deepface
- esp32
- fer
- flask
- javascript
- python
- raspberry-pi
- react
- reactnative
- supabase
- tensorflow
