Inspiration
Many individuals living with severe mental health conditions, such as bipolar disorder, schizophrenia, and borderline personality disorder, struggle with emotional regulation, loneliness, and daily functioning. Emotional Support Animals can provide comfort and stability, but they also require consistent care, time, and energy. During depressive or manic episodes, even basic responsibilities can feel impossible, which can lead to unintended neglect and overwhelming guilt, ultimately worsening mental health.
We wanted to create a form of emotional support that doesn’t require energy, responsibility, or performance. Something that is gentle, responsive, and always present — especially on the days when reaching out feels too hard.
This became Savo.
What it does
Savo is a personalized emotional-support companion robot that helps users process and regulate their emotions in real time. Users interact with Savo through speech or text, and their input is analyzed using sentiment and tone detection to identify emotional states such as sadness, stress, frustration, or numbness. Based on the detected emotional state, Savo generates a response using the comfort style most appropriate for the user—ranging from affirmations and grounding prompts to humour or logical reframing. Over time, Savo learns which strategies are most effective for each individual and adapts accordingly. All emotional interactions are stored and visualized through a pie-chart dashboard, allowing users to observe patterns in their mood. Savo provides consistent companionship and emotional support without requiring care, energy, or responsibility in return.
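The adaptation loop described above can be sketched roughly as follows. This is an illustrative model only: the emotion labels come from the description, but the strategy names, scores, and function names (`pickStrategy`, `reinforce`) are assumptions, not Savo's actual implementation.

```typescript
type Emotion = "sadness" | "stress" | "frustration" | "numbness";
type Strategy = "affirmation" | "grounding" | "humour" | "reframing";

// Per-user preference scores for each comfort strategy (hypothetical values),
// nudged up or down based on how the user responds.
type Preferences = Record<Strategy, number>;

const defaultPreferences = (): Preferences => ({
  affirmation: 1,
  grounding: 1,
  humour: 1,
  reframing: 1,
});

// Pick the strategy with the highest learned score for this user.
function pickStrategy(prefs: Preferences): Strategy {
  return (Object.keys(prefs) as Strategy[]).reduce((best, s) =>
    prefs[s] > prefs[best] ? s : best
  );
}

// Reinforce a strategy when the user responds well to it,
// and gently decay it when they do not.
function reinforce(prefs: Preferences, used: Strategy, helped: boolean): void {
  prefs[used] += helped ? 0.2 : -0.1;
}
```

Over many interactions, a scheme like this drifts toward whichever comfort style a given user actually finds helpful, which is the personalization behaviour described above.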
How we built it
- Authentication & User Management: We implemented Auth0 to securely authenticate users, ensuring emotional data remains private.
- Emotion Recognition Pipeline: User input is analyzed with Gemini for sentiment and tone classification, supplemented by FaceAPI for optional facial emotion cues.
- Adaptive Response Generation: We built a dynamic system-prompting response engine that selects comfort strategies based on emotion classification.
- Data Storage & Personalization Layer: Emotional states and chat logs are stored in MongoDB Atlas, enabling historical emotional pattern analysis.
- Analytics Dashboard: We visualized emotional trends with pie charts and distribution summaries, allowing users to gain insight into their emotional states over time.
- Speech Integration: Built-in speech-to-text and text-to-speech allow natural spoken conversation with Savo, making it accessible to more users.
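The dynamic system-prompting response engine above could look something like this sketch. The emotion categories match the description, but the prompt wording, the style table, and `buildSystemPrompt` are assumed for illustration, not Savo's exact prompts.

```typescript
type Emotion = "sadness" | "stress" | "frustration" | "numbness";

// Hypothetical per-emotion style instructions injected into the system prompt.
const styleByEmotion: Record<Emotion, string> = {
  sadness: "Respond with gentle affirmations and validation.",
  stress: "Offer a short grounding exercise and keep the pacing slow.",
  frustration: "Use calm logical reframing; never dismiss the feeling.",
  numbness: "Use soft, low-pressure prompts; do not demand engagement.",
};

// Compose the system prompt sent to the language model for this turn.
function buildSystemPrompt(emotion: Emotion): string {
  return (
    "You are Savo, a gentle emotional-support companion. " +
    `The user currently seems to feel ${emotion}. ` +
    styleByEmotion[emotion]
  );
}
```

Selecting the instruction per detected emotion keeps the model's tone aligned with the comfort strategy the engine has chosen for that user.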
Challenges we ran into
Midway through the hackathon, we had to completely pivot from a hardware-based design to a software-only solution due to insufficient physical components and compatibility issues. This shift required rethinking our system architecture, feature scope, and workflow under tight time constraints. We had to refactor much of our original plan to focus on digital emotion recognition and conversational modeling rather than physical sensor integration. Additionally, connecting our data visualization dashboard—specifically the pie chart analytics—to live program outputs proved challenging, as it required managing asynchronous data flow between the backend and frontend. Despite these hurdles, the pivot strengthened our problem-solving skills and improved our ability to adapt quickly in a high-pressure environment.
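The aggregation step feeding the pie chart can be sketched as a pure function over the stored logs. The record shape (`EmotionRecord`) and function name are assumptions; in practice this would run against the MongoDB Atlas collection rather than an in-memory array.

```typescript
// Assumed shape of one stored emotional interaction.
interface EmotionRecord {
  emotion: string;   // e.g. "sadness", "stress"
  timestamp: number; // stored alongside each chat turn
}

// Count each emotion's share of all logged interactions, as percentages,
// which is the form a pie-chart component can render directly.
function pieChartData(records: EmotionRecord[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const r of records) counts[r.emotion] = (counts[r.emotion] ?? 0) + 1;
  const total = records.length || 1; // avoid dividing by zero
  const shares: Record<string, number> = {};
  for (const [emotion, n] of Object.entries(counts)) {
    shares[emotion] = (n / total) * 100;
  }
  return shares;
}
```

Keeping the aggregation pure like this makes the asynchronous part a thin layer: the backend computes the shares and the frontend simply fetches and renders them.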
Accomplishments that we're proud of
We are proud of creating a gentle, emotionally intelligent companion that can adapt its support style based on real emotional responses. We designed a minimal and soothing interface guided by mental-health-first UI principles to avoid overstimulation and reduce cognitive load. We also successfully integrated multimodal emotion recognition with personalized response patterns, allowing Savo to respond more appropriately to different emotional states. Finally, we built a visual dashboard that promotes self-awareness and emotional reflection by helping users understand how their emotions shift over time.
What we learned
Throughout development, we learned that building effective emotional-support technology requires more than just generating responses. Personalization models must be adaptive, since different users respond to different comfort strategies. We also discovered that multimodal emotion detection is complex — aligning text-based sentiment (Gemini) with facial expression inference (FaceAPI) required calibration to avoid misclassification. Additionally, we recognized the importance of low-cognitive-load interface design, where pacing, colour, and layout influence user stress levels. Finally, we confirmed that data privacy and reliability are essential in mental-health applications; secure authentication, consistent system behaviour, and transparent data handling are foundational to user trust.
What's next for SAVO?
- Mobile companion app + wearable device integration
- Voice tone detection and refined emotional inference models
- Optional therapist integration for between-session emotional insight (opt-in only)
- Partnerships with schools and university wellness centers
- Development of comfort behaviours (animations, gentle motions, expressive cues)