INSPIRATION When I moved to the United States for my education, I felt isolated from my loved ones and everything familiar. Over time, I began showing signs of anxiety and depression, but they often went unnoticed. Expressing such emotions is hard, and many people silently go through the same struggles. This personal experience inspired MoodScope — a watchdog for unnoticed emotions. It is designed to identify individuals showing prolonged signs of sadness or depression in classrooms, workplaces, or personal environments, enabling early, preventive mental health care.
WHAT IT DOES MoodScope uses computer vision and deep learning to detect, track, and analyze emotions in real time. It detects faces using MTCNN, classifies emotions with a trained CNN model (emotion_detection_model.h5), and visualizes trends over time. MoodScope can be used in two ways: 1. Real-time webcam emotion detection. 2. Uploading a prerecorded video for analysis. MoodScope also allows exporting emotion data as a CSV file for long-term tracking, and the exported data can be re-imported for further analysis. Additionally, it integrates Ollama (LLaMA 3) to generate AI-powered insights and wellness recommendations.
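The classify-then-export flow above can be sketched in plain Python. The seven FER-style emotion labels, the CSV column names, and the rolling-window size are assumptions for illustration; the actual label order depends on how emotion_detection_model.h5 was trained, and the MTCNN/CNN calls themselves are elided.

```python
from collections import deque
from datetime import datetime, timezone
import csv
import io

# Assumed seven FER-style classes; the real order is defined by the
# training setup of emotion_detection_model.h5.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def top_emotion(probs):
    """Map a CNN softmax vector to its most likely emotion label."""
    i = max(range(len(probs)), key=lambda k: probs[k])
    return EMOTIONS[i], probs[i]

class EmotionLog:
    """Rolling log of per-frame predictions with CSV export."""
    def __init__(self, window=100):
        # Keep only the most recent `window` frames for the live trend view.
        self.records = deque(maxlen=window)

    def add(self, probs):
        label, conf = top_emotion(probs)
        ts = datetime.now(timezone.utc).isoformat()
        self.records.append((ts, label, round(conf, 3)))

    def to_csv(self):
        """Serialize the log; column names here are an assumption."""
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["timestamp", "emotion", "confidence"])
        writer.writerows(self.records)
        return buf.getvalue()
```

In the real pipeline, `probs` would come from running the CNN on each MTCNN-cropped face, and `to_csv()` backs the export feature; re-import is then a matter of reading the same columns back with `csv.reader`.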
HOW WE BUILT IT • PyQt5 for the user interface • OpenCV and MTCNN for face detection • TensorFlow/Keras CNN for emotion classification • SQLite for emotion logging • Matplotlib for visualization • Ollama (LLaMA 3) for reflective AI insights
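The SQLite logging layer in the stack above could look roughly like this; the table name and schema are assumptions, not the project's actual schema.

```python
import sqlite3

def open_log(path=":memory:"):
    """Open the emotion log database, creating the (assumed) table if missing."""
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS emotions ("
        "ts REAL, label TEXT, confidence REAL)"
    )
    return con

def log_emotion(con, ts, label, confidence):
    """Insert one per-frame prediction."""
    con.execute("INSERT INTO emotions VALUES (?, ?, ?)", (ts, label, confidence))
    con.commit()

def emotion_counts(con):
    """Per-label frame counts, e.g. to feed a Matplotlib trend chart."""
    rows = con.execute(
        "SELECT label, COUNT(*) FROM emotions GROUP BY label ORDER BY label"
    )
    return dict(rows.fetchall())
```

A file path instead of `:memory:` persists the log across sessions, which is what makes the long-term tracking and CSV export possible.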
All computation runs locally to preserve user privacy. MoodScope is lightweight and can run on low-power devices such as a Raspberry Pi 5 with a simple webcam or phone camera.
CHALLENGES WE RAN INTO • Building a lightweight AI model capable of real-time local inference. • Managing multithreaded PyQt5 operations for smooth video processing. • Designing a real-time analytics dashboard that updates without lag. • Integrating AI-generated insights while keeping the system offline.
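The multithreading challenge above boils down to keeping frame analysis off the UI thread. MoodScope does this with PyQt5's threading machinery; the queue-based worker below is an equivalent stdlib sketch of the same pattern, with `analyze` standing in for the face-detection-plus-CNN step.

```python
import queue
import threading

def start_worker(frames, analyze, results, sentinel=None):
    """Drain frames on a background thread so the UI thread never blocks.

    `frames` and `results` are thread-safe queues; pushing `sentinel`
    shuts the worker down cleanly.
    """
    def run():
        while True:
            frame = frames.get()
            if frame is sentinel:
                break
            # In MoodScope this would be MTCNN detection + CNN inference;
            # the result is handed back for the dashboard to render.
            results.put(analyze(frame))
    t = threading.Thread(target=run, daemon=True)
    t.start()
    return t
```

In a PyQt5 app, the results queue would instead be a signal emitted back to the main thread, since widgets must only be touched from there.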
ACCOMPLISHMENTS WE ARE PROUD OF • Achieved about 85 percent emotion detection accuracy on local inference. • Created a privacy-first, cross-platform emotion recognition tool. • Integrated AI insights and real-time analytics with no cloud dependency. • Enabled both prerecorded video processing and emotion data export. • Optimized the system to run efficiently on lightweight hardware.
WHAT WE LEARNED • How to fine-tune and deploy on-device CNN models. • How to handle data pipelines for emotion analytics. • How to integrate AI insights into PyQt5 applications. • The importance of empathy and user privacy in AI design.
WHAT’S NEXT FOR MOODSCOPE • Add optional cloud synchronization for long-term tracking. • Improve accuracy with larger datasets such as FER2013+. • Develop group analytics for classrooms and offices. • Expand to mobile and web platforms for wider accessibility.
SUMMARY MoodScope combines computer vision, real-time analytics, and AI-driven reflection to help detect early signs of emotional distress. It provides an efficient, private, and accessible approach to mental health awareness — bridging the gap between silent suffering and timely support.
Built With
- metalapi
- mtcnn
- opencv
- tensorflow