Emotions are often difficult to understand in the moment.
During stressful situations like exams, deadlines, or personal challenges, people tend to suppress or misinterpret their feelings rather than reflect on them.
Most existing mental health tools feel overwhelming or invasive, or they rely on black-box AI systems that provide little explanation.
I wanted to build something simple, ethical, and transparent — a tool that helps users pause, reflect, and understand their emotions without judgment.
That idea led to MindMirror AI.
🧠 What It Does
MindMirror AI is an explainable emotional reflection web app that helps users:
- Express emotions through text or voice
- Identify their primary and secondary emotions
- Understand why an emotion was detected
- See emotion intensity on a visual scale
- Track mood changes over time
- Receive gentle, emotion-aware suggestions for mood regulation
The system is privacy-first — no accounts, no tracking, and no data storage.
⚙️ How I Built It
The project is divided into two main parts:
🔹 Backend (Python + Flask)
The backend uses a rule-based emotion inference engine that analyzes text using:
- Weighted emotional keywords
- Negation handling (e.g. “not happy”)
- Intensity cues (exclamation marks, capital letters, intensifier words)
- Multi-emotion detection (primary + secondary)
- Confidence scoring
Each prediction includes a human-readable explanation, making the AI fully explainable rather than a black box.
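To make that concrete, here is a minimal sketch of how a rule-based engine like this can work. The word lists, weights, and function names below are illustrative examples, not the project's actual code:

```python
# Minimal, illustrative sketch of a rule-based emotion engine.
# All names, weights, and word lists are hypothetical examples.

NEGATIONS = {"not", "never", "no", "isn't", "don't"}
INTENSIFIERS = {"very": 1.3, "really": 1.3, "extremely": 1.5}

# Weighted emotional keywords: word -> (emotion, base weight)
KEYWORDS = {
    "happy": ("joy", 1.0), "excited": ("joy", 0.9),
    "sad": ("sadness", 1.0), "worried": ("fear", 0.8),
    "angry": ("anger", 1.0), "calm": ("calm", 0.7),
}

def analyze(text: str) -> dict:
    tokens = text.split()
    scores: dict[str, float] = {}
    reasons: list[str] = []

    for i, raw in enumerate(tokens):
        word = raw.strip("!?.,").lower()
        if word not in KEYWORDS:
            continue
        emotion, weight = KEYWORDS[word]
        prev = tokens[i - 1].strip("!?.,").lower() if i > 0 else ""

        # Negation handling: "not happy" should not score as joy.
        if prev in NEGATIONS:
            reasons.append(f'"{prev} {word}" treated as negated, skipped')
            continue

        # Intensity cues: intensifier words, exclamation marks, ALL CAPS.
        if prev in INTENSIFIERS:
            weight *= INTENSIFIERS[prev]
        if raw.endswith("!"):
            weight *= 1.2
        if raw.strip("!?.,").isupper() and len(word) > 2:
            weight *= 1.2

        scores[emotion] = scores.get(emotion, 0.0) + weight
        reasons.append(f'"{word}" adds {weight:.2f} to {emotion}')

    if not scores:
        return {"primary": "neutral", "secondary": None,
                "confidence": 0.0, "explanation": ["no emotional cues found"]}

    # Multi-emotion detection: rank emotions, keep the top two.
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(scores.values())
    primary = ranked[0]
    secondary = ranked[1] if len(ranked) > 1 else None

    return {
        "primary": primary[0],
        "secondary": secondary[0] if secondary else None,
        # Confidence: share of total evidence behind the primary emotion.
        "confidence": round(primary[1] / total, 2),
        "explanation": reasons,  # human-readable reasoning trail
    }
```

For input like "I'm really worried but not sad", a sketch like this boosts fear for the intensifier while skipping the negated "sad", and the explanation list records both decisions.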
The backend also tracks emotional transitions and mood trends to detect improvement or escalation.
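The trend logic can stay equally simple. A hypothetical version compares the average intensity of negative emotions in the most recent entries against the window just before them:

```python
# Hypothetical sketch of mood-trend detection over a session's history.
# Each entry pairs an emotion label with its intensity (0.0 to 1.0).

NEGATIVE = {"sadness", "fear", "anger"}

def mood_trend(history: list[tuple[str, float]], window: int = 3) -> str:
    """Compare recent negative intensity against the preceding window."""
    if len(history) < 2 * window:
        return "not enough data"

    def negativity(entries):
        vals = [x for emotion, x in entries if emotion in NEGATIVE]
        return sum(vals) / len(entries)

    earlier = negativity(history[-2 * window:-window])
    recent = negativity(history[-window:])

    if recent < earlier - 0.1:
        return "improving"
    if recent > earlier + 0.1:
        return "escalating"
    return "stable"
```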
🔹 Frontend (HTML, CSS, JavaScript)
The frontend is a lightweight, framework-free web interface designed for calm interaction.
Key UI features include:
- Emotion-based color themes
- Intensity visualization bar
- Mood history chart using Chart.js
- Voice input via browser speech recognition
- Accessible and minimal design
The frontend communicates with the backend through a simple REST API and updates the UI in real time.
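On the Python side, that API can be a single Flask route wrapping the inference engine. The route path, module name, and payload fields here are assumptions for illustration, not the project's documented API:

```python
# Minimal Flask wiring for an analyze endpoint (illustrative only).
from flask import Flask, jsonify, request

from emotion_engine import analyze  # hypothetical module holding the engine sketch

app = Flask(__name__)

@app.post("/api/analyze")
def analyze_endpoint():
    # Expect a JSON body like {"text": "..."} from the frontend.
    text = (request.get_json(silent=True) or {}).get("text", "")
    if not text.strip():
        return jsonify({"error": "empty input"}), 400
    return jsonify(analyze(text))

if __name__ == "__main__":
    app.run(debug=True)
```

The frontend then posts the reflection text and renders the returned primary emotion, confidence, and explanation list.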
🧩 Challenges I Faced
One major challenge was designing an emotional AI system that felt supportive without being prescriptive.
Other challenges included:
- Handling mixed and negated emotions correctly
- Avoiding medical or diagnostic claims
- Balancing simplicity with meaningful insight
- Creating a calm UI under time constraints
- Making the AI explainable instead of opaque
Each challenge required careful design choices rather than just technical solutions.
📚 What I Learned
Through this project, I learned:
- How to design human-centered and ethical AI systems
- Practical emotion analysis using NLP concepts
- Frontend–backend integration without heavy frameworks
- The importance of explainability in AI
- How small UI decisions affect emotional experience
Most importantly, I learned that good AI is not just about accuracy, but about trust and clarity.
🔮 Future Improvements
Planned enhancements include:
- More advanced emotion modeling
- Long-term emotional trend analysis
- Personalized reflection tones
- Mobile-first experience
- Optional local-only data persistence
🏁 Closing Note
MindMirror AI is not meant to replace professional mental health support.
It is designed as a reflection and awareness tool that encourages users to better understand their emotions in a safe and respectful way.